glaurung: (Default)
Another thinky post. This one will be short. First, a shout-out to [personal profile] conuly who kindly drew attention to my error in the last post about food insecurity, and also gave me an invaluable new vocabulary term: "Food Collecting Peoples" instead of "Hunter Gatherers" does away with the "hunter" label that brings up sexist and inaccurate ideas about how those people lived.

I made a mistake in the first "extra stuff" note I put in the comments to the last thinky post about the invention of agriculture and whether inequality and war are necessarily linked to "civilization."

It's a very common assumption that food collecting peoples live a more precarious and food insecure existence than farmers - that they are more in danger of starving to death. This assumption has generated not just harmless mistakes like my footnote but scads of bad science, such as the harmful "Thrifty Genotype" hypothesis among dieticians, which assumes that weight gain happens because humans are adapted to survive alternating waves of feast and famine, and thus that self-starvation through dieting is the only proper way to address obesity (the more I learn about diet and obesity science, the clearer it becomes just how wrongheaded and discriminatory the entire discipline is).

In fact, food collecting peoples were *not* more prone to suffer food insecurity than farming peoples. Three separate analyses using the Standard Cross-Cultural Sample (TIL that there is a standardized data set for making cross-cultural analyses, tailored to eliminate similarities due to cultural borrowing by limiting itself to cultures that are widely separated in space/time) found no support for the assumption. Comparing across all cultures in the sample, there's no difference in food insecurity between farmers and food collectors. If you control for climate (because arctic food collectors like the Inuit *are* more food insecure and in danger of starvation), then food collectors are *less* likely to be food insecure than farmers. Farmers are tied to their land and at the mercy of whatever happens to their crops, but food collectors can pick up and move to a different area, or simply switch to a different food source that was not impacted by the drought/flood/whatever (here is the only open access article of the three. It's the most recent, and footnotes 19 and 20 link to the other two articles. CW: the article's discussion centres the obesity research angle).

The myth of food collectors' food insecurity is mostly born of prejudice (the assumption that life "in the state of nature" was "nasty, brutish and short" dates back at least to the 17th century). Some of it is due to selective noticing of the data: famine and food insecurity were at least sometimes an issue for both food collectors and farmers. And some of it traces back to the artificially created food insecurity of people under colonialism and post-colonialism, which the colonizers always blamed on the victims rather than admitting their own role (having your land arbitrarily chopped into blocks that you're not allowed to go onto is not good for food collectors, even before we add the colonizers actively murdering them). Even today, most Google hits for "hunter gatherer food insecurity" are papers and articles about how *former* food collecting cultures are suffering food insecurity now that they have largely ceased their own collecting practices and come to rely on food distribution by the nations in which they live. And, finally, at least in my case, some of the myth is due to wrongfully applying the special case of arctic peoples (the one climate where agriculture is impossible and food collectors do suffer from increased food insecurity) too broadly - I remembered reading about how Nanook (of the 20's documentary Nanook of the North) starved to death a few years after the film was made, which was explained as not unusual among the Inuit, and I took that as confirmation that I could accept the received myth and didn't have to google yet another fact.

So, with that myth debunked, why, exactly, did food collecting people switch to farming over most of the world? Farming is measurably worse by almost every metric: more work for the same or greater food insecurity, with more disease, worse nutrition (from a less varied diet), shorter lifespans, shorter adult stature, etc.

That farming made you more susceptible to disease was not evident to premodern people, who lacked the tools to make statistical analyses, but at least some of the consequences of farmers' ill health *were* visible. An example off the top of my head (from the book 1491 by Charles Mann): early European accounts of First Nations people mentioned how healthy, tall, robust, and handsome Indians were compared to the malnourished, disease-ridden Europeans. Most of the Indians in question were farmers themselves, but they had a broader, more nutritionally complete set of crops and, without domesticated animals, they were relatively disease-free. If the bigoted Europeans noticed and commented on the difference between the better-nourished, disease-free First Nations farmers and themselves, then food collectors must have noticed differences in health between themselves and the farmers whose technology they adopted.

First, in some places, food collectors didn't switch so much as get assimilated by farmers who moved into the area - genetic analysis of human remains from central Europe shows that the switch from food collecting to farming involved a genetic change, with an influx of people showing some degree of Anatolian ancestry moving in with their farming technology and mating with the local food collectors. But in other areas that genetic shift does not occur (the food collectors of the Baltic states, for instance, adopted the agricultural technology but did not interbreed with the people who brought it to them).

Second, depending on how violent that assimilation process was, people like those Baltic food collectors might have adopted farming in self-defence, regardless of the downsides.

Elsewhere, for instance in North America, there's clear evidence of agricultural technology diffusing without attendant migration, so: no assimilation or threat of assimilation. Why switch to a food system that required more work, had visible negative effects on the people who adopted it, and provided no real improvement in food security?

One common answer (e.g., Jared Diamond's) is that they were forced to by population pressure. This is Malthusian bullshit (another thinky post, about Malthus being completely wrong, will happen someday). Humans have always had the ability to limit their family sizes. Population only increased when technological change made it possible to reduce the land area needed per person. Look at the times between those technological shifts, and population remains extremely stable, with little to no growth for vast stretches of time. There was no Malthusian pressure on food collectors to increase their food supply. Population increases happened after they changed their technology, not before.

Another answer I've seen mooted is that agriculturalists live settled lives, and that this enables them to accumulate more belongings and become richer than nomads. This overlooks the vast number of settled food collecting societies, where rich natural food sources meant people could live in one place permanently and own lots of things, without becoming farmers. It also overlooks that nomadic food collectors had a home range with which they were deeply familiar, and a limited number of home camps that they visited at more or less set times, depending on the season and what food sources were due to become collectable where. They could cache belongings at those camps and not have to limit themselves to what they could carry. So they weren't necessarily as poor and bereft of possessions as the popular conception of them would have it.

A lot of the links I get when googling for reasons that food collectors switched to farming focus instead on the *invention* of farming, and suggest that it happened because settled food collectors in naturally rich areas (like the parts of the fertile crescent where wheat farming was invented) had to either become unsettled or invent new ways to get food when the place where they had been living became less rich, whether due to climate change or over-exploitation. Which is not at all in accord with what we know about the actual timelines of plant domestication, extending as they do back to the height of the last glacial period, so that food collectors were perfecting agriculture while the climate was improving and the richness of their homeland was increasing (see my previous thinky, linked above).

To restate the question: of the food collectors who had the choice to adopt already-invented farming technology (many desert/steppe dwellers and all arctic people did not have that choice, nor did those who adopted it under the threat of assimilation), some did not adopt the new tech, or resisted doing so until colonialism/invasion took away their choice (maybe because they saw evidence of the many downsides of farming). Others accepted the choice, despite those visible downsides: why? I still haven't found a reason proposed that sits well with me. But I do have a crackpot theory of my own.

Maybe, just maybe, it was because while agriculture did not provide any real benefit to settled food collectors, it did give the *appearance* of benefit. It gave the illusion of control: it made the people who practised it feel better able to ward off bad times, because as a farmer, you were creating your own food instead of being dependent on the forces of nature to provide it for you. Food collecting meant being at the mercy of countless factors beyond your control or ken. Farming meant being at the mercy of just one: rainfall. It wasn't actually better than food collecting, but it felt better, because *it was less scary*, and that's why it proved so popular.
glaurung: (Default)
Another thinky post. This one has been brewing for a while. Unless otherwise indicated, links are open access/not paywalled. (ETA: see comments for some interesting things/extra thoughts that didn't fit in this monster of a post)

I started googling articles about the origins of cities and agriculture due to my disaffection with the opening parts of Karen Armstrong's "Fields of Blood," where she assumes that military violence and structural inequality are necessary ingredients for creating a civilization, and links "civilization" (ie, inequality and warfare) to the invention of agriculture. That didn't sit well with my memory of Catal Huyuk, one of the earliest known cities: 6,000-ish people living in a town on the Konya Plain of south-central Turkey about 9,000 years ago. Catal Huyuk had no city wall or other signs of military defences. It also had no temples or palaces, and minimal signs of social inequality. Just a community of thousands of people living in peace, trading obsidian tools for luxury goods from distant communities.

And then more recently I read an article about Gobekli Tepe, the oldest megalithic site yet found. It's a large worship complex in Turkey near the border with Syria that was built starting 11,500 years ago, by people who did not yet have domesticated plants or animals. And that turns the conventional model of the history of cities on its head.

The traditional model of prehistory, which Armstrong hews to, holds that hunter gatherers had extremely low population densities, with bands of 40-ish people occupying a large home range. In places like the fertile crescent, wild precursors to crops provided them with enough food that they could settle down and begin the process of inventing agriculture. That process is supposed to have begun around the end of the ice age, when human-influenced mass extinction of megafauna made big game hunting no longer a viable survival strategy, and when the climate became warmer, wetter, more stable, and generally congenial to the invention of gardening. Farming started out in small communities, and only over time, as the total food surplus increased and trade fostered more specialized crafts, did hamlets become villages, towns, and cities. But Gobekli Tepe was a fair-sized community of (technically) hunter-gatherers existing before plant or animal domestication.

Which made me go pull Jane Jacobs' Economy of Cities off the shelf. Jacobs's book is a pro-urbanist argument that cities are central and primary to the economy of the surrounding lands, contrary to the tendency of urban planners, architects, economists, and other thinkers from the 19th and early 20th century to regard cities as a bad thing, an aberration of capitalist development that should be done away with when creating planned communities. But Jacobs starts the book by arguing that cities existed before agriculture, that it was urban living that created the necessary conditions for the development of agriculture - on the one hand, providing the necessary gathering of minds needed to spur intellectual ferment and technological progress, and on the other, creating a logistical problem of feeding an ever-growing urban population, to which animal husbandry and agriculture were the solutions.

Archaeologists dismissed that part of her book, and urbanists who praise the rest of it tend to glide in embarrassment over the first section. But now we have Gobekli Tepe. A community of hundreds (judging by how much work was required to build the megalithic temples) of settled, non-nomadic people who didn't have domesticated crops or animals.

Exactly how large the community was isn't yet clear - until very recently no houses had been found, and it was thought that people didn't live there full time, but just came together from the surrounding lands to build shrines and worship, then dispersed to nomadic hunter-gathering again once a shrine was completed. Which makes little sense, of course, but archaeologists are very good at swallowing camels and straining at gnats when those camels and gnats are inconvenient to their preconceived ideas about what early people did and did not do. A few years ago, in the process of constructing a visitor's centre at the site, including a big permanent tent to shelter the excavated megaliths, they finally dug deeply enough to find houses, and the picture changed from a collection of temples in the wilderness to a more sensible one of a town whose people devoted a lot of resources to building worship spaces.

A town, but not really a town of hunter-gatherers, except technically. They didn't have domesticated animals and the bones of animals found in their kitchen waste show they hunted extensively, but Gobekli Tepe is in the middle of the part of the fertile crescent where agriculture was invented, and it flourished in the immediate pre-agricultural era, when people were perfecting their skills at raising and harvesting edible wild plants.

But first we need to backtrack. The myth of "man the hunter" refuses to die. Early people were not hunter gatherers but gatherer hunters - if they were anything at all like modern non-agricultural peoples, meat from hunting made up only about 20% of their diet. But because bones don't decay like plants do, archaeologists see the evidence of hunting first, and the evidence of plant gathering second, and only if they use microscopes to examine the dirt adhering to stone tools and containers for tiny fragments of seeds. 19th and early 20th century archaeologists washed their finds, which eliminated the clues needed to verify what plants the people who used those tools had been processing. Archaeological techniques have advanced, and an article from 2010 reports that unwashed grinding stones from diverse sites in Europe showed signs that their owners had used them to process grass seeds and cattail roots as far back as 30,000 years ago. That's deep in the depths of the last ice age.

We also know that ice age humans were harvesting the wild ancestors of modern domesticated crops. At Ohalo II, a cluster of 23,000-year-old huts and hearths on the shore of what is now the Sea of Galilee (back then part of a single body of water called Lake Lisan, encompassing the Dead Sea, Jordan River, and Sea of Galilee), stone sickles were found, as well as remains of wild wheat, barley, and other grains (the site burned, then quickly afterward flooded when the water level rose. Charring followed by anaerobic conditions led to excellent preservation of organic material like plant parts. The site was discovered when a drought temporarily lowered the lake level several meters in the late 80's). Furthermore, a genetic analysis of plant parts found in archaeological sites across the Fertile Crescent indicated that humans had started to select against the genes controlling seed dispersal (the genes dealing with the shattering of ripe ears of grain) up to 25,000 years ago for emmer wheat and 32,000 years ago for einkorn wheat.

Which throws the entire "agriculture happened when it did because the ice age was over" meme into the trash heap. Contemporaries of the people hunting mammoths were practising agriculture in the wetter fringes of the harsh desert that filled the middle east during the last glacial maximum. During the LGM, roughly between 31,000 and 16,000 years ago, the entire planet was much colder and much drier than today (check out the vegetation map, and note that "Extreme desert" means less than 2% plant cover - basically like the dry parts of the Sahara desert today).

But, and a big but: that map is rather coarse grained and doesn't allow for rivers or microclimates. The Nile still existed, at a reduced rate of flow, and the upper Nile backed up behind sand dunes to create lakes in the desert where people lived and fished on the shore. The Tigris and Euphrates also still existed. Lower sea levels meant the Persian Gulf was a dry river valley surrounded by harsh desert. While the homes of the people who lived there are now underwater, the sudden appearance of people (paywalled) to either side of the gulf immediately after the sea level rose suggests that humans made their homes in the bottom of the gulf, then were forced to migrate into the less hospitable surrounding deserts when it became flooded. And in Palestine, the shores of Lake Lisan were fertile enough to support humans despite the entire area being "extreme desert" according to the map.

So, agriculture (in the sense of humans planting and harvesting grains like wheat) got its start not in the fertile Holocene after the ice had melted and the planet warmed up, but during the worst, most brutal parts of the last ice age. Forests were a lot sparser and more limited in extent during the ice age, but I don't doubt that where there were forests, humans were creating forest gardens back then as well.

And at this point we're starting to get far enough back in time to tangle with another mental block archaeologists have, and another extremely racist myth that I was peeved to see Karen Armstrong perpetuating: that "cognitively modern humans" got their start much more recently than anatomically modern humans. Something was lacking, the myth says, in Homo sapiens before 40-50,000 years ago (Armstrong says 20,000, which is even more ridiculous, since we do have widely accepted and extremely well dated examples of "modern" tools, art, etc from well before then), something mental that suddenly changed, "coincidentally" around the time they started spreading out from Africa. Believers in the myth point to the seeming explosion in the diversity of human tools and artefacts around that time. Sceptics patiently dig up examples of "modern" tools from much older sites, and defend their reality as artefacts, their dating, and their interpretation against vehement attack. Eventually, one hopes, the mental block will be discarded for the racist claptrap that it is: the refusal to see early, mostly African humans as qualified to be counted as "human" in the same way as their descendants who left Africa.

This is related to the mental block archaeologists have against admitting that early humans could produce art, such that any obvious examples of art found that date before a certain time frame get ignored, attacked as not actually made by humans, or attacked as not actually as old as proponents think. Which brings me to another book (last one, I promise!), "Lost Civilizations of the Stone Age" in which Richard Rudgley doesn't address the myth of cognitive modernity directly, but attacks it by attacking the refusal to give legitimacy to artefacts and art that are deemed "too old to actually be art/advanced technology."

So, when did agriculture get its start? It's very hard to say, especially since many of the best sites for gardening during the ice age are now under the ocean. But clearly, it wasn't 12,000 years ago, and it wasn't in response to the extinction of megafauna or the moderating of climate as the glaciers melted.

Even if we restrict ourselves to people who had domesticated plants (the last 12,000 years), at least half of the age of agriculture happened before humans invented inequality, wealth hoarding, and empire building, the violence that Armstrong sees as inseparable from civilization. If we include the long period of gardening plants as they gradually became domesticated (and Gobekli Tepe shows that such gardening was productive enough to support towns of at least a few hundred people engaged in major construction projects), then the age of inequality and brutality that Armstrong equates to civilization has been around for only a quarter of the time in which humans had the ability to build towns and have food surpluses.

Inequality isn't inextricable from civilization; it's just one approach, one that violently exterminated all the other approaches it came into contact with throughout history.
glaurung: (Default)
Famously, the Incas did not use mortar in assembling their stone buildings, which were so perfectly carved and set that there was essentially no gap between the blocks. Each hand-carved stone block was made to mirror the concavities of the block below with convexities of its own, so that once laid, the blocks interlocked, with each course resting in hollows in the course below - this made the walls extremely durable in the face of earthquakes. The stone masons took irregular rocks and made them fit together perfectly without doing away with the irregular shapes, resulting in striking patterns in the stone. (OTOH, they also made walls with rectilinear blocks; I think the thing was they used whatever rock was at hand. Loose rocks cleared from the construction site produced irregularly shaped blocks, while quarried stone produced rectilinear blocks)

The outside faces of the walls were chiselled and sanded smooth for aesthetics, especially on more important buildings and high status homes. The inside faces were usually not cut or polished smooth, and there was not as much care to make the gaps between blocks as tiny as possible, except in the very highest status buildings.

But, I had trouble finding imagery that contrasted the pretty vs not so pretty sides of a stone wall, so I'm not sure just how much the inside walls were different from the outside walls. Decorative bas relief carvings were sometimes executed on the blocks. The Incan stonemasons were willing to work with whatever sized rocks were handy, from small blocks to megalithic scale boulders.

Looking for information about Incan stonework led me to a fascinating article that tries to untangle just what it was that the Incas used instead of mortar. We know from early Spanish accounts that they did use something, which Spanish eyewitness chroniclers did not properly understand but tried to describe anyway. Some said the Inca stone workers used a reddish mud. Others, that they poured molten gold and silver into the cracks between the stones. The Incas themselves said the stones were made to flow into place. The article assembles all these tidbits and suggests that the stonemasons were using mud from copper and tin mine tailings, in which sulphur-metabolizing bacteria would eat iron pyrite, producing extremely strong sulphuric acid. Chemical reactions between the acidic mud and the rocks would have given off steam, and that plus the glitter of fool's gold might have confused the chroniclers who wrote about molten metal. Acidic mud, the author argues, helped to temporarily soften the stone of the blocks, encouraging a more perfect, gap-free fit as the softened stone flowed ever so slightly before re-hardening. Close examination of the joins between stones today shows a tiny discolouration that suggests such a chemical reaction took place. It's fascinating, and worth a read if you have the time. Otherwise, enjoy photos of Incan masonry here.
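(A chemistry footnote of my own, not from the article: pyrite oxidizing in wet mine tailings is the same process that produces acid mine drainage today. The overall reaction runs roughly 2 FeS2 + 7 O2 + 2 H2O → 2 FeSO4 + 2 H2SO4, and sulphur-metabolizing bacteria speed it up enormously - which is how a mud could stay acidic enough to keep etching stone while the masons worked with it.)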
glaurung: (Default)
I was reading about Machu Picchu and Inca stonework last week, and came across the hoary old colonialist talking point that the Incas did not invent the wheel. Being in a "go down internet rabbit holes" mood, I found myself reading various explanations for why precolumbian Americans, despite having wheeled figurines (see: image here and writeup here) and early prototype pottery wheels (the kabal/molde), never scaled those figurines up or broadened their application to transport. Very few of the explanations sat well with me.

Read more... )
Central American pottery wheels were still at the "spin the pot relatively slowly while laying coil" stage when the Spanish arrived. Not all the pieces had come together yet, and thanks to the conquest, they never would.

(extra bits that I found while writing this but that didn't fit are in comments)
glaurung: (Default)
Two books that have been on my mind recently.

I have slowly been working my way through Karen Armstrong's Fields of Blood, which is about the degree to which religions advocate for or support violence. It's basically an in-depth "it's more complicated" rebuttal to the Islamophobic claim that Islam is a warlike religion, as well as to the atheist talking point that Christianity has been the cause of innumerable wars through the centuries.

Armstrong starts by noting that civilization, as traditionally defined, is founded in violence - the expropriation of food and labour from the poor by the ruling elite, on the one hand, and the destruction of the poor by the elite's soldiers in wars of conquest, on the other. Armstrong's expertise is the history of the major religions of Eurasia, but I found myself arguing with the book quite a lot in the early chapters where she relied on some out of date archaeology to talk about the origins of agriculture and cities.

Cities and civilization did not have to be founded on violence - we have the peaceful, undefended ruins of Catal Huyuk, an ancient city built without fortifications or other defences against attack, which also seemed not to have a ruling class of haves who stood above the have-nots. Kingdoms and empires, and the wars they engaged in, dominate traditional world history lessons because the warmongers conquered everyone else and got to write the histories. But they're not the only way our ancestors did civilization.

Recently I took a break from Armstrong to re-read another book on religion and war: Barbara Ehrenreich's Blood Rites. Ehrenreich is a journalist rather than a scholar, and as a historian I found things to disagree with all the way through her book, but the core thesis seems pretty solid:

1. War is an ancient activity, but not universal - it seems to be a cultural disease, a meme that infects cultures. Once one group arms itself and attacks its neighbours, those neighbours have to follow suit or be destroyed. And thus the war disease spreads through the world. Always, however, humans seem to talk and think about war in religious terms, especially calling the death of soldiers "sacrifice."

2. Sacrifice, in turn, while not much practised by modern monotheistic religions, was the bedrock of all worship, including monotheistic worship, throughout the ancient world, and (often in a tamed and vegetarian form) still is the bedrock of polytheistic worship everywhere. The gods demand not prayers but fresh blood and meat. They are envisioned to actually be feeding on the meat that is burnt upon their altars. Gods are carnivores.

3. The gods of the ancient world were not humanity's friends, but rather dangerous and cruel beings with violent and destructive inclinations. Gods do not protect humans from natural disasters; they are the cause of those disasters. Worship and sacrifice are all about appeasing them and preventing them from destroying humanity.

4. Humans are prey animals: our distant ancestors often ended up as food for leopards, and even in the era of agriculture and civilizations, big cats continued to hunt and kill farmers and shepherds as well as their livestock, until humans exterminated enough of them that the threat became rare. (Ehrenreich speculates that the architecture of ancient settlements that had holes in the roof accessed by ladders instead of doors - eg, ancestral pueblo cultures in the US Southwest and the inhabitants of Catal Huyuk - was about reducing the threat of predators coming into people's houses at night).

5. Our gods are predators, and the original sacrifice to them, before domesticated animals were a thing, was human sacrifice. Whatever it might have become since then, religion started out as an attempt to prevent disasters (including predation) by appeasing predator-gods with gifts of fresh meat.

6. Naturally the best human sacrifices are people who don't belong to the community making the sacrifice. And thus, war began, in the distant past, as a religious activity, as raiding parties that enabled one group of hunter-gatherers to appease their gods with the blood of members of another group of hunter-gatherers. While everything else about war has changed beyond recognition, the religious language used for it, and quasi-religious way of thinking about it, remain unchanged from its roots as a religious practice, raiding one's neighbours in order to feed one's gods.

Religion has of course since then become a lot of other things: returning to the themes of Armstrong, the core of most of today's major Eurasian religions (Armstrong sadly has never written about the precolonial religions of Africa or the Americas; this is her great failing) is a quest to ameliorate human suffering, to fight against the grim truth that life is full of suffering and ends in death.

All through the founding documents of Judaism, Christianity, Islam, Buddhism, etc, are exhortations for people to help the less fortunate, be kind to their fellow humans, and to treat others as they would like to be treated. And those exhortations coexist uneasily with passages that depict the gods who are supposedly asking us to be nice to each other as violent, abusive, capricious beings that would just as soon destroy us if we don't feed them plentifully and regularly with sacrificial meat.

There's a huge gulf between religious institutions, which, in cahoots with the rich and powerful, seek to suppress dissent and keep the common people from disrupting the rapacity of elites; and spiritual movements, which have always been about helping the downtrodden and demanding that the rich share their bounties with those who have nothing.

Ehrenreich's book has a lot about warmongering elites who delight in war. Some of it falls afoul of her lack of expertise in the history she's covering, but I think it's interesting that while sacrificial religion seems to predate agriculture and the creation of "civilizations" which divide people into elites and commoners, the main proponents and perpetuators of war-based worship over the past 12,000 years have been those elites, while the main proponents of being nice to each other have been common people.

The elites of the ancient world delighted in being "hunters" - of literal animals, including big cats who prey upon farmers, and of their fellow humans, through war. Instead of worshipping predators, they became them. Blood sacrifice to capricious gods became monetary sacrifice to overlords who held all the wealth and power in society.
glaurung: (Default)
Another thinky thought post brought on by a video seeking to answer the question of why Europeans enslaved Africans specifically.

And while the video didn't contain any misinformation, it felt a bit incomplete, because I've recently read David Graeber's "Debt: The First 5000 Years," and because of a recent post on the Collection of Unmitigated Pedantry blog which talked about slavery in the process of critiquing a world conquest strategy video game.

Read more... )
Lastly, have a table from Debt, showing just how miserably poor Europe was compared to essentially everyone else in Eurasia, even when comparing them to nations from centuries or millennia previous. Crappy climate > low agricultural productivity > low population densities > few and small cities > economic backwater.

Copying just one column of data from a table showing population and tax revenue for several ancient and early medieval nations:

Persia, 350 BCE: 41 grams of silver per person per year
Egypt, 200 BCE: 55 grams
Rome, 1 CE: 17 grams
Rome, 150 CE: 21 grams
Byzantium, 850 CE: 15 grams
Abbasids, 850 CE: 48 grams
T'ang, 850 CE: 43 grams
France, 1221 CE: 2.4 grams
England, 1203 CE: 4.6 grams
glaurung: (Default)
Blog post 1: The Unmitigated Pedantry blog mentioned in passing today that in medieval Europe, fortifications were built with thin stone walls, which were very easy to destroy with the early, crude cannons of the 1400's. In China, on the other hand, fortresses were built with thick earthen walls lined with a thin layer of bricks, which were immune to early cannons.

Both places had access to the same kind of early artillery technology at roughly the same time, but in China, cannons were seen as a novelty of not much use. In Europe, the earliest, crudest cannons were a game changer, enabling the conquest of forts and cities without long sieges, leading to massive shifts in power as those who could afford cannons conquered their smaller, poorer neighbours, until the only nations left standing a few centuries later were countries that could afford the massive expense not just of cannons, but of building lots of all-new cannon-proof fortifications to defend their territories.

And this military transformation within Europe fed into other interacting factors to transform Western Europe from a poor backwater that was decidedly weaker than the vastly larger, more populous and far richer nations of Central and Eastern Asia, into a colossus of conquest that took over the entire world in the 18th and 19th centuries.

The question that the Pedantry blog did not address was why China built its forts so differently from Europe.

Which brings me to another blog post from last year: The Analog Antiquarian has been posting multipart essays about the 7 Wonders of the Ancient World for a while now. Sadly he is not a historian and sometimes uses old and outdated books as his sources, and I have found his novelistic approach sometimes off-putting. But one thing I learned from his series a while back: archaeologists have never been able to find the Hanging Gardens of Babylon, of ancient clickbait fame ("You'll never guess what building is number six on our list of the 7 most awesome structures worth seeing in the world!").

The Hanging Gardens of Babylon seem to have never actually existed in Babylon, although there are scholars who think something like what was described in the ancient lists did in fact exist in Nineveh. But this just trades one mystery (where was it?) for another (why did so many writers of the ancient world mix up two very distinct cities?).

But in the process of explaining the non-discovery of the Hanging Gardens by modern archaeologists, the Analog Antiquarian highlighted something I had already sort-of known: that ancient Babylon left behind very few ruins, because of its location. In the middle of a vast floodplain, quite far from any hills or mountains, with nothing but silt beneath their feet as far as they could dig, ancient Babylonians built everything, from hovels to palaces, out of mud brick. Which, over the millennia, has completely eroded away into subtle mounds on the landscape, plus, sometimes, the ceramic tiles that once decorated the outer layers of the walls of more elaborate buildings.

For instance, we have today a reconstruction of the Ishtar Gate. The wood of the gate rotted away, and the mud brick of the walls that flanked it eroded to nothing, leaving only the ceramic tiles which adorned it and made it splendid enough to get on the original lists of World Wonders (until a later revision bumped Babylon's walls and gates to make room for the Lighthouse of Alexandria). German archaeologists dug up the tiles of the gate in the early 1900's, took them home, and reconstructed the gateway in 1930: today you can see it in Berlin's Pergamon Museum (imperial German funding meets colonial archaeology, sigh).

Thinky thoughts produced: China, like Babylon, is a civilization centred on floodplains (the Yellow and Yangtze rivers), where stone has to be imported and the easiest and cheapest way to build fortifications is with earth. And naturally when China's rulers expanded beyond the floodplains, they stuck to known and familiar technology, continuing to build fortifications with thick earthen walls even when stone was available. So they never had the kind of thin masonry walls that primitive cannon were useful against.

Whereas the nations of Europe are mostly not centred on vast floodplains where stone is hard to come by. Stone was the first thing they reached for when they needed to build a fireproof fortification, until cannons made such walls obsolete.
glaurung: (Default)
This post brought to you by my brain refusing to stop chewing on a bit of esoterica that no one outside of Apple pundits gives a flying fuck about.

In the mid 90's, Apple had a near-death experience. The number of people willing to pay a heavy premium for a special nonstandard computer, when you could get a much cheaper standard Windows 95 machine that did 95% of what the Mac could do, was plummeting. And yet the company continued to churn out a huge swath of different models of computers, as well as printers, proto-PDAs called Newtons, and so on. Losses mounted, and Wired published a cover story about the imminent death of the company.

Then in the late 90's Steve Jobs returned to Apple and amputated big chunks of the company, cancelling numerous projects and product lines. He condensed the company's output down to exactly four Mac models, expressed by a famous (in Apple circles at least) graphic:

Grid-of-4

(ID: four computers in a grid. At the top, labels "consumer" and "professional", and along the side, labels "desktop" and "portable", with a blue CRT iMac, a blue Power Mac, three of the colourful clamshell iBooks, and a dark grey Powerbook)

Now almost immediately this grid acquired some footnotes - the laptops and iMacs started coming in different screen sizes, and once you'd chosen a size you had to choose among low/medium/high end specifications for the processor, etc. But for most of the oughts, Apple made exactly four kinds of Macs and it was very easy to tell which one met your needs.

By the end of the oughts, the grid had expanded, without anyone ever actually saying anything about it. The new, unspoken Macintosh product grid looked like this (image thrown together quickly with a meme generator because I was lazy, forgive the small size and low quality)

grid of 6

(ID: six computers in a grid. Columns labeled "consumer | professional | tiny" across the top. Images of an iMac, a Mac Pro, and a Mac Mini in the first row, and a plastic Macbook, a Macbook Pro, and the old rounded-corner Macbook Air in the second row)

It had taken Apple a couple of false starts to get there (the Power Mac Cube, the 12" Powerbook), but by the end of the decade they had expanded into a new product category: tiny computers. For a while, the Mac Mini was the smallest desktop it was possible to buy. For a while, the Macbook Air was the only ultralight laptop with an almost-fast-enough dual-core processor and a full size keyboard.

Ten years later, the grid of six became a grid of five. The entire laptop market had glommed onto thin and light, and the niche, expensive Macbook Air had become Apple's best selling Mac, their new mainstream base model laptop. Apple's Mac Mini hadn't changed much in size, but it was no longer the smallest desktop, and the Macbook Air had competitors that were even lighter. Finally, Apple's "consumer" desktops had become powerful enough that lots of professionals were using them, and many consumers were buying the more expensive "pro" laptops. So the categories needed a renaming. Instead of consumer and professional, let's call them "mainstream" and "high end", with "small" a better descriptor today than "tiny".

grid of 6 2020

(ID: a grid of six, this time labeled "mainstream, high end, small" across the top. iMac, Mac Pro, and Mac Mini in the top row, and Macbook Air, Macbook Pro, and a question mark in the bottom row)

The complications and footnotes with this grid are all in the laptop category. While the larger and more expensive Macbook Pro has always remained solidly in the "high end", the Macbook Air has jumped from "small" to "mainstream" in that it's now the default laptop that most people buy. And the smaller Macbook Pro is suffering a bit of an identity crisis, as it's split into low-end and high-end models, distinguished by the number of ports they have. The low-end "pro" model seems better suited to being the "mainstream" choice, letting the Air go back to being Apple's "small" laptop.
glaurung: (Default)
(note: A lot of this is inspired by the Collection of Unmitigated Pedantry blog's series last fall on premodern subsistence farming, and especially their addendum on rice)

Premodern subsistence farmers organized their farms not around maximizing yield, but around doing all they could to ensure their family did not starve. Extended families of 8-ish people would farm just a couple of hectares of land. Over time large farms would shrink as siblings each took their share of an inheritance, until they hit the minimum size of farm needed to produce enough food for everyone. This is true of all premodern farmers, from China to Europe. Everything was organized around guaranteeing, as much as possible, that no one in the family would starve between one harvest and the next. Each family would work several small fields scattered around the village where they lived, each field in a different terrain with a different microclimate. If one area around the village had too much or too little water in a year, or if one hillside was blighted by disease or pests, everyone in the village would be a little worse off, but no families would face complete destruction. Reducing the risk of starvation was the main priority, not producing a surplus of food to feed to non-farmers.

Staying alive was a community effort. If one farm was pillaged (legally by the aristocracy or illegally by bandits), suffered a sudden death, or had an unusually bad harvest despite its scattered fields, other families in the village would help out. It's common to talk about this in capitalist terms, but that projects modern economic concepts of money and debt onto a past that was not capitalist but communalist. Money, debt, and a market economy existed, of course, but they were imposed on the farmers and the village community from above, by the wealthy and by the towns and cities that sold vital speciality goods to the farmers nearby.

All of that is universal regardless of what the farmers are growing. But the requirements of rice and wheat farming produced vastly different social systems and vastly different societies. Read more... )

In the rice belt of Asia, farmers did not have to pay their lords a fee in order to keep their families fed. They were self-sufficient in a way their wheat-raising counterparts were not and could not be. And at the same time their communities engaged in multigenerational projects to create more farmable land, projects that were simply impossible for their wheat-raising counterparts in Europe. I don't know enough about the history of China and other rice-based nations to say much about the impact this had on the very different histories of the two regions, but it does give food for thought.

One last thing: traditional farming in Europe and America is all but extinct. Essentially no one still grows wheat in order to eat it themselves, and all farmers, even the Amish, are more concerned with producing crops to sell than with feeding themselves. Farming families work far more than two hectares, and they don't worry about divvying up fields into small bits to minimize microclimate failure. In the rice belt, on the other hand, modern farmers still farm in the traditional way. They use fertilizer and high yield breeds of rice which let them produce a large surplus to sell, but the essential system - of small fields created with vast amounts of labour, flooded and farmed with even more labour - remains the same.
glaurung: (Default)
I bought a G4 Mac Mini, because I thought it might be fun to mess around with classic mac software someday, and I didn't want to get something large that would take up a lot of room, so that meant getting a mini.

Naturally I had to upgrade the spinning hard disk in it, because hard drives suck. After much searching I discovered you can buy mSATA to 2.5" IDE adapters fairly cheaply, which enable you to put a fast SSD into a 2.5" laptop sized IDE case. I got one, got an mSATA drive, and was all set.

I performed the upgrade... and could not get the computer to boot from a CD. I tried cloning the existing OS to the new drive, and could not get the cloned drive to boot. Key combinations that were supposed to force a mac to boot from optical disk failed to work. After weeks of banging my head against this wall, I finally realized that the original drive in the Mini was set as a secondary IDE drive, rather than the default primary drive. Fortunately the adapter had pins for a jumper. I took the jumper off the old drive, put it on the new drive, reassembled the Mini for the 6th or so time, and... it worked perfectly.

Long ago, in the early oughts, I knew about IDE drives and jumper settings, and I knew that you could only have one drive set as primary at a time. But I had utterly forgotten about all that crap in the intervening decade. It doesn't help that most IDE laptops (and the mini is just a headless laptop) had two channels, one for the CD and one for the hard drive, so you didn't have to think about primary/secondary. But Apple made the Mini using the cheapest, most minimal possible combination of parts, which means one channel. Since the optical drive has no jumpers and is always set as primary, the hard disk has to be set as secondary.

None of the instructions I used, neither Apple's tech repair manual nor Ifixit, mentioned that you have to set the new IDE drive to be secondary. I looked online and found that exactly zero of the top hits for "Mac Mini G4 upgrade" mentioned jumpers at all.
glaurung: (Default)
The Verge has a navel-gazing article about netbooks, those tiny, cheap laptops that were incredibly popular for a few years in the late oughts and then vanished utterly by the early teens. And by navel-gazing I mean that the author interviewed a few of his journalist friends, none of whom knew any more about the reason netbooks were popular than he did, and then wrote an article displaying his profound ignorance.

It's very simple, for those who are not narcissistic tech journalists: netbooks were two things that each had a significant market, at a time when the only way to take the internet with you was to carry a laptop. 1. They were tiny. At a time when a regular laptop weighed five pounds and a big screen laptop six pounds, netbooks were just one kilogram (about 2.2 pounds). A netbook would fit in any old shoulder bag with lots of room for other stuff; a notebook required its own dedicated bag. 2. They were cheap. At a time when the cheapest full size laptops cost $600, and a decent Thinkpad cost $1000, a netbook could be had for less than $300.

Size of course was a huge selling point. At the time, the only viable way to access your email and read the latest doings of your friends on Myspace and Livejournal was with a laptop. A tiny laptop that didn't need its own bag and wouldn't take up the entire surface of your table at Starbucks was vastly preferable, even if it was molasses slow and had a keyboard made for hobbit-sized hands. And of course, tech journalists and other professionals who needed to travel a great deal were always looking for a notebook that was smaller and lighter, so they wouldn't need such a heavy carryon bag. Some of them were even willing to put up with a crappy undersized keyboard to get that lighter carryon. Ultralight laptops had existed for a long time, but they cost a lot more than a standard laptop, and were hard to justify on a journalist's salary.

Cost was also a huge selling point. A $300 laptop made owning any kind of computer possible for the first time for a huge number of low income people all over the world who would otherwise never have been able to afford one. People who might as well be utterly invisible as far as narcissistic tech pundits are concerned.

Then in 2010 Apple came out with iPads, on the one hand, and with Mark II of the Macbook Air on the other. And within a few years the entire technology industry followed in their footsteps, as usual. Full sized but thin and ultralight laptops came down in price to $1000 or less, and siphoned off from the netbook market all of the professionals and writers who were looking for affordable-to-them small and light writing machines. Tablets and smartphones siphoned off all the people looking for internet access devices they could carry with them. Meanwhile, laptop makers started making full size laptops lighter and lighter, and selling them for less and less money, until netbooks were left with no one willing to buy them.
glaurung: (Default)
I finally checked out some of the DC animated movies. The good ones were quite good. Sadly, neither of the Wonder Woman animated movies released to date were in that category.

Wonder Woman (2009) has a nonsensical villain - Ares is a god, and gods crave worship, so why the heck does he want to exterminate humanity and thus deny himself any worshippers? But the real problem is that it oozes frat-boy sexism, from a slimy Steve Trevor to an Amazon who turns against her people because she was denied the opportunity to marry and have children (bleah). And Trevor saves Paradise Island from being destroyed, because behind every powerful woman there has to be a slimy man without whom the day would not have been saved. 🤬 Best avoided.

Wonder Woman: Bloodlines, meanwhile, is an incoherent mess of a film. The only explanation I can come up with is that DC had the script for a TV miniseries, but then a wild goat got loose, scattered the pages, and ate half of them at random. And rather than print out a new copy, they decided to just film the remaining half of the pages and call it a movie. It feels like a much longer story with all of the connective tissue removed. It never once stops to explain motivations, give characters a second to be themselves, or make much sense at all. Again, best avoided.

On the other hand... All Star Superman is a marvellous film that is just as good as the comic book miniseries it adapts. That's right, a film version that is fully faithful to the original material: something I would never have expected was possible from Warner Brothers.

Finally, Batman: Soul of the Dragon is a fun love letter to 70's martial arts movies. Thankfully it makes Richard Dragon (DC's version of Iron Fist, the white guy who becomes the world's best kung fu master) into an Asian man, thus reducing, slightly, the orientalist and racist nature of the material it's reworking. If it were all about Batman I'd not like it nearly as much, but he is just one of an ensemble cast of martial artists (and the only white person among them) who have to band together to defeat the bad guys. Plus, instead of being the best, Bruce Wayne is explicitly called out as the least skilled among them, which I thought was a nice touch.
glaurung: (Default)
About Joss Whedon and the shows he created... and about other creators who have done deplorable things.

There's a line in Lovecraft Country that one of the characters speaks about their love of the racist ERB Warlord of Mars stories: “Stories are like people. Loving them doesn’t make them perfect. You just try and cherish them and overlook their flaws.”

But that's not quite right. You don't overlook their flaws. You love them in spite of them... or else the flaws become too huge and you have to break off the relationship because of them.

And both reactions are fine. Art exists both in and out of the author's shadow. There's a part of it that gets tarnished by the author's attitudes and behaviours. And there's a part of it that exists beyond that. It's possible to have your enjoyment diminished by the tarnished bits and still adore the rest. It's possible to not be able to enjoy anything about the work anymore because the tarnish has eclipsed everything else for you.

Different people are going to have different responses, and that's fine. As long as we aren't denying or minimizing the gravity of the evils committed by the artist, there's nothing wrong with boycotting somebody's books forevermore, and there's also nothing wrong with continuing to read and enjoy their work for the untarnished bits.

(steps down from soapbox)
glaurung: (Default)
So, I came across a post about queer themes in Wonder Woman, Wonder Woman's war-era sidekick Etta Candy, and Dr Wertham. Which was so riddled with errors that I just had to write a post of my own (because comments were not enough).

Wonder Woman started out as feminist propaganda. Kinky, queer, bondage-obsessed, with a 19th-century kind of feminism very different from modern ideas (women are not equal to men but different from them, and women should be in charge because they will do a better job), but nonetheless, feminist propaganda. The queer kinkiness was filtered and coded, of course, by being published in comic books for children in the 40's, but it was still undeniably there.

Wonder Woman's sidekicks and Diana Prince's friends were Etta Candy and the girls of Beeta Lambda sorority at Holliday college. They were part of that propaganda message - promoting women's colleges, women's education and independence, and the idea that any woman can be a heroine like Wonder Woman if she puts her mind to it. Etta and her girls were also (coded, filtered) gay or bi characters, modelled on women that Marston's bisexual partners had known in the women's colleges they attended and the women's college sororities they had belonged to.

However, Etta was never Wonder Woman or Diana's girlfriend, even in subtext. From day one, the Wonder Woman comic adopted a genderswapped version of the Superman-Lois Lane dynamic, with Diana infatuated with Steve Trevor, who was infatuated with Wonder Woman.

Marston and his female partners co-created Wonder Woman and co-wrote each story, but sold them under Marston's name. When Marston died, the editors at DC refused to hire his uncredited women co-writers, and instead handed the comic over to Robert Kanigher, a typically sexist man who had no truck with all this feminist stuff.

Kanigher jettisoned the feminist messages that had appeared in every story, jettisoned most of the bondage themes, and jettisoned Etta and her sorority sisters. He kept (and enhanced) the eclectic, magic-meets-sf-meets-mythology-meets-fairy tales setting of Paradise Island, and kept the Diana-Steve-Wonder Woman love triangle. Because the love triangle was boring as fuck, he set a lot of his stories on Paradise Island. Without Etta and company, and without queer women co-writing behind the scenes, the comic became completely heterosexual, despite being often set on an island populated only by women.

Fast forward to 1953, when psychiatrist Fredric Wertham published a screed against violence and sexuality in comic books (expanded into a book the following year), which he felt were the root cause of juvenile delinquency and of the sexual irregularities of his child patients. Wertham's primary targets were crime and horror comics, but he did devote a little space to superhero comics like Batman ("a wish dream of two homosexuals living together") and Wonder Woman ("for boys... a frightening image. For girls... a morbid ideal"). Wertham's book states that it's based on seven years of research, which might explain why he called out the Holliday girls in Wonder Woman as "gay party girls, gay girls" - despite the fact that Holliday college had been dropped from the comics for six years by the time his book was published.

Wertham was successful in virtually exterminating crime and horror comics, but he didn't actually have all that much effect on superhero comics - Bruce Wayne and Dick Grayson continued sleeping in twin beds in the same room together long after Wertham, and in the case of Wonder Woman, the censoring of gay themes had already been done several years before he came along.

Sources: Seduction of the Innocent, The Secret History of Wonder Woman (both on my shelf), various comic nerd web sites, and my own personal knowledge from having read tons of Wonder Woman comics, including reprints of dozens from the war years and a few from the post-war, post-Marston era.
glaurung: (Default)
I realized there was another list of GWG movies (in a paper book on HK cinema, rather than a website), and I found a few that I hadn't seen yet. Only the first of these is really any good.

Princess Madam (1989). Read more... )

Mission of Justice (1992) Read more... )

A serious shock! Yes Madam! (1993). Read more... )

The Avenging Quartet (1993) Read more... )

Madam City Hunter (1993). Read more... )
glaurung: (Default)
I see so many people on Facebook buying awful laptops, or asking for help with choosing a new laptop.

Laptop makers produce two very different kinds of laptops. They have lines of laptops intended for consumers, which are made to look pretty and cost as little as possible. Consumer laptops are *not durable* - they're made to outlive a one-year warranty, and no longer. They come with tons of promotional crapware preloaded - software that the manufacturer is paid to put on the computer (not the other way round), which is poor quality and makes money for its developer through ads, popups demanding that you upgrade, or actual user tracking and spying. And they are *not* designed to be easy to repair or upgrade.

And then there are lines of laptops intended for businesses and corporations, which are usually not loaded with crapware (or not as much), which are designed to outlive a three or four year corporate replacement schedule, and which are often quite durable. And they can be easily repaired, because the corporate buyer gets them with a multi-year maintenance contract. They cost more, often significantly more. But they are worth it - a good corporate-grade laptop will last until it becomes obsolete, and will be more likely to survive accidents.

Understandably, most people are reluctant to pay over a thousand dollars for a laptop when they can get one for less than five hundred. But the best part about corporate laptops is that the companies that buy them replace them long before they cease to be useful, so there are tons of "off lease" laptops available for about the same as, or only a little more than, a new consumer laptop from Best Buy. And despite not having a manufacturer's warranty, those used business-class laptops are a far better value.

And now, because I had to hunt this information down: For the top five manufacturers, the lines of laptop that are business class instead of consumer class are:

Lenovo Thinkpad, especially the T (general) and X (ultralight) series.
HP Probook (general business) or Elitebook (high end workstations)
Dell Latitude (general) or Precision (high end)
Acer Travelmate
AsusPro.

I prefer Thinkpads, but Dell and HP also make good notebooks. I've no experience with Acer or Asus.
glaurung: (Default)
I have read Pursuit of the Pankera so you don't have to. The short version: the first third is exactly the same as NOTB; it then becomes a much better book for the middle half, before turning into a much worse book in the last several chapters.

This is not a review so much as a comparison, and I am spoiling both books very thoroughly.

first some context )

and now the discussion )

To sum up, Pursuit of the Pankera fails to work as anything other than a poorly developed frightened white man's "aliens as stand ins for scary brown people/yellow peril/urban blacks" trope. You are better off not reading it unless you want to see the Mary Sue-ish fanfiction in the middle section.
glaurung: (Default)
Some recent mainstream movies I have seen, some good, some bad.

Star Wars 9: Rise of Skywalker. Read more... )

Terminator: Dark Fate. Read more... )

Once Upon a Time in Hollywood. Read more... )

Charlie's Angels Read more... )

Birds of Prey. Read more... )

Knives Out Read more... )
glaurung: (Default)
Ok, the films in this final batch are all set in the 30's, with no fantasy elements. The first two are set in Korea and pull out all the stops in depicting Japanese characters as greedy, cruel bullies and rapists. Be warned: massively negative ethnic stereotyping fills both of them.

*Hapkido (1972) Read more... )

*When Taekwondo Strikes (1973) Read more... )

My Young Auntie (1981) Read more... )
glaurung: (Default)
This was the first feminist anime I downloaded and watched. I did a writeup back in 2016 for an email list I'm on, but I forgot to post it here.

Read more... )

All in all, it's good, and I am glad I watched it, but I will not ever be re-watching it. Semi-recommended.
