glaurung: (Default)
Yes, I know this thinky is not about snails/underwear either. I'll get to that, I promise.

You remember the story that was all over the news last week about how a comet/asteroid had blown up 3500 years ago over Tell el-Hammam, in the Jordan Valley near the Dead Sea, destroying the city, and how this was covered as the source of the biblical legend of Sodom?

Well, buckle your seatbelts, it's going to get bumpy. The article in question, "A Tunguska sized airburst destroyed Tall el-Hammam a Middle Bronze Age city in the Jordan Valley near the Dead Sea" was published in Nature Scientific Reports (*not* Nature itself, as was often misreported). Nature Scientific Reports is Nature's far less prestigious open access counterpart. The article says it's based on fifteen years of annual excavations, which in the world of archaeology is quite a lot: someone must be very well funded.

The owner of the Slacktivist blog noticed this line in the opening section of the article: (the excavation project is) "under the aegis of the School of Archaeology, Veritas International University, Santa Ana, CA, and the College of Archaeology, Trinity Southwest University, Albuquerque, NM." Both of those schools are bible colleges. Trinity Southwest is proudly unaccredited. Veritas International is accredited by the Transnational Association of Christian Colleges and Schools (ie, an association of bible schools that wanted to be able to say they were accredited).

Trinity started out as an in-person seminary in Oklahoma in the 80's. After moving to New Mexico and becoming a distance learning school, they affiliated with an unnamed overseas "internationally-known Bible college and seminary," then declared themselves a university in the early 90's. Although primarily doing distance learning, they say they offer in-person classes as well. However, they have no campus: what physical locations they have are scattered along Journal Center Boulevard in Albuquerque. Their website is pretty minimal. Their library doesn't get a dedicated section on the website and gets barely a mention in the student catalog. They do have their own press, though. And, they offer tours of the Holy Land for $4000 to see the sights, or $5000 if you also want to visit their archaeological dig at Sodom. Stuff they dig up at "Sodom" (aka Tell el-Hammam) goes on display in their very own archaeological museum in Albuquerque.

Veritas was founded as a seminary in 2008, and only decided to call themselves a university in 2017. Veritas's campus, as best I can tell from perusing the catalog, is one building, and their library has all of 4,000 paper books (unsurprisingly, the website gives a lot more emphasis to their digital resources). Library users are asked to bring their own computers, so the library doesn't really have terminals. They offer several doctor of divinity degrees, but just one PhD program: in biblical archaeology.

The websites of both of these schools are at pains to put their best face on and pretend that they are real institutions worthy of the name university despite not actually being anything like that. Trinity's campus is scattered, but there is no map in the student catalog or anywhere on the website showing where things are. While I think they don't actually offer in-person classes except in a very minimal way, they do claim to have several physical resources and in-person classrooms, and since those are not all in one place, there needs to be a map. But posting it to their site would be too much of an admission of just how small and inconsequential they are. In the same vein, their catalog doesn't seem to differentiate between distance learning courses and actual in-person classes. I think it's safe to say that they don't have many full time faculty other than, perhaps, a gaggle of remote adjuncts tasked with interacting with the distance learners who have been paying $250-ish per credit hour to support fifteen years of excavations in "Sodom."

Neither school provides basic academic information like the number of instructors or the number of students anywhere I could find. Also, unsurprisingly, neither school's website has a single word to say about COVID that I saw.

But wait: there is more. The Tell el-Hammam paper has a very long list of authors. Only the last, Phillip J. Silvia, works at Trinity Southwest; all but one of the rest are affiliated with real universities, or else with real research laboratories. What's up with that? Turns out there's a second fly-by-night organization here, the "Comet Research Group." They get called out in the paper's acknowledgements for funding the research (as opposed to the excavations) behind the paper, and Allen West, the second-to-last author of the paper, is one of the CRG's founders.

The CRG is all about finding evidence that ancient comet/asteroid impacts caused local or global catastrophes. Before the Tell el-Hammam paper, they made a splash a few years ago with a proposal that the Younger Dryas, a thousand-year cold snap right at the end of the last ice age, was caused not by a shutdown of the Gulf Stream in the North Atlantic, as paleoclimate researchers usually think, but by the impact on Earth of a swarm of cometary fragments, Shoemaker–Levy 9-style. The swarm supposedly caused widespread destruction, a nuclear winter that extended the ice age by a thousand years, mass extinctions of megafauna, population collapse of early humans in the Americas, and so forth.

The Younger Dryas impact hypothesis has been taken seriously by climate and ice age researchers, who investigated its claims and found them mostly without merit. The only thing that seems to have come out of it is a spike in platinum residue, at some locations, in sediment layers deemed to mark the start of the YD; that platinum could point to an asteroid or comet impact happening somewhere near the start of the YD... but the layers being pointed to are not all the same age, there is no agreement that they all actually mark the start of the YD, the other signs in those layers that are supposedly evidence of an impact are all very debatable, and so on.

What's odd is that the CRG has not responded to the scientific community's criticism of their hypothesis in the usual way (going back to the drawing board, trying to find new evidence, pruning away some of the more extreme claims and saying surely we can agree on this part, etc), but rather by refusing to share their samples and data with people they deem to be "on the other side" of the debate. Much more about that here, and on Mark Boslough's twitter (see below).

The long list of CRG "scientists and members" on their website includes co-authors of papers who have otherwise had nothing to do with the CRG, as well as people who were never asked whether they minded being listed as members and who, when they found out, were upset at being included. Very classy.

And finally, one of the co-founders of CRG, Allen West, is not actually an academic, does not actually have an advanced degree, and has in the past, under a different name, been convicted of selling fraudulent water studies to California municipalities despite not being a geologist. So, a con artist passing as an academic and geologist who befriended the other co-founders, became infected with their obsession with cometary impacts, and proceeded to reinvent himself as a cometary impact specialist.

An earlier version of the Tell el-Hammam paper appeared a few years ago. That paper, "The 3.7kaBP Middle Ghor Event: Catastrophic Termination of a Bronze Age Civilization," was a conference presentation at the 2018 annual meeting of the American Schools of Oriental Research. The ASOR (now called the American Society of Overseas Research because someone realized their old name was racist, but they still publish a bulletin and hold an annual conference under their old name because, hey, still racists) dates back to 1900, when they were called the American School of Oriental Study and Research in Palestine. In short, they are the big leagues in the small (and sometimes dubious) field of biblical archaeology.

The ASOR presentation had just four authors: Silvia and Steven Collins, both faculty at Trinity Southwest University; Ted Bunch, a co-founder of the Comet Research Group; and finally Malcolm Lecompte, the odd man out, an emeritus faculty member at Elizabeth City State University in North Carolina. Lecompte has a website: he does not list the ASOR paper in his vita.

Looking at all that, it seems that the CRG, an organization devoted to proving that comets killed the mammoths and extirpated our distant ancestors 13,000 years ago, is Very Good Friends with a bunch of young earth creationist "archaeologists." But the CRG seems to have a rather Trump/Republican approach to science: declare your dubious findings as proven, then label anyone who disagrees as an enemy and refuse to cooperate with them. So, maybe not such strange bedfellows after all.

There's a lot more about the Tell el-Hammam paper's shoddy research and dubious claims to be found in Mark Boslough's twitter account (Boslough is an *actual* asteroid impact researcher, and just one of many who are tearing their hair out over the paper; thanks to Robin Reid for bringing his twitter threads to my attention). Unfortunately Boslough has been posting his thoughts in several short threads, and not always remembering to link them together. Here's a starting point, but you may not always get continuation links (I ended up going to his main feed and scrolling down to find the next thread, but I started at a different point and he may have gone back and fixed things since then).

And this concludes our journey into the realm of fake science getting published in real journals and covered as legitimate.
glaurung: (Default)
Just a short gruntle about blind spots in archaeology today. I will do another post about the annoying viral twitter thread by Incunabula soon.

Archaeology is a very vexing science. On the one hand, it's amazing that we are able to figure out so many things about humans who lived tens of thousands of years ago. On the other hand, archaeologists are so timid in their approach, so unwilling to commit to any conclusion that they cannot prove by means of the stones and bones they dig up, and so wedded to certain theories, that they develop huge blind spots and propound absurd conclusions, like the one underlying the dates on the arrows in the Americas on this map from Wikipedia.

We know that 60,000 or more years ago, ice age humans built boats and navigated across the water between Sunda (southeast Asia plus the western islands of Indonesia) and Sahul (New Guinea and Australia). Ice age humans knew how to build boats. We know this for certain, because the people of Australia and New Guinea arrived there at a time when there was ocean between them and the rest of Eurasia, which they had to cross.

But change the context to the Americas, and suddenly the fact that humans were building boats and going around on the ocean 60,000 years ago vanishes utterly, and the conventional archaeological view is that the Siberian people who would go on to become the original Americans were land-dwelling people who walked across Beringia (the land between Siberia and Alaska) to North America 25,000 years ago, then cooled their heels in Alaska for 10,000 years until an ice-free corridor opened up between the ice sheets east of the Rocky Mountains, allowing them to walk, or rather, sprint, overland from Alberta to Tierra del Fuego in less than 2,000 years.

This despite the fact that in historical times, with a much more moderate climate, people who make a living in the Arctic have often been boat-using people who get much of their food from the sea. If post-ice age people found the best way to survive in the far north was by going out on the water, fishing and hunting whales and seals, why are we expected to believe that ice age people, facing a much harsher climate, would limit themselves to the food they could find on land?

Part of this absurdity - the idea that ancient Americans only arrived south of the ice sheets 16,000 years ago - is finally starting to crumble under the increasing weight of evidence for a much older human presence in the Americas - from the 23,000 year old footprints of children found in White Sands National Monument (published just this week, sadly paywalled), to 30,000 year old tools found in caves in Mexico (open access version), and on to many more studies going back decades. The archaeological community is very good at straining at gnats and swallowing camels, attacking any evidence that contradicts the accepted conventional theory as being misdated, misinterpreted, or not actually artefacts made by humans at all. In other words, these findings of people in the Americas 20,000+ years ago are still, sadly, controversial.

And it doesn't have to be this way. We know that humans built boats, even if none of those boats have survived to be dug up. We know that the sea shore and shallows are one of the richest habitats on the planet for food-collecting peoples - a fact obscured by the way that those who collect food from the sea have been labelled as "fishers" while land based food collectors are called "hunter gatherers."

There is no way. No. Way. that inland dwelling food collectors of 60,000-ish years ago walked to the end of the land in Sunda, built boats to get to Sahul, and then walked away from the shore inland. They were fishers and boat people, they lived on the shore. They travelled up rivers, looking for more good fishing spots, settling on lakes and marshes away from the shore, and gradually over time, the descendants of those fresh water boat people adapted to a fully land-based lifestyle and filled the interior of the continent.

We know that the sea today is around 120 meters higher than it was during the ice age, which means that all of the homes used by the sea peoples of 60,000 years ago are now buried deep under the water, but that doesn't mean they did not exist. If ice age people were building boats to cross from Sunda to Sahul 60,000 years ago, why are we expected to believe that ice age people in Siberia 25,000 years ago did not know how to build boats, and instead walked to Alaska and then were stuck there for ten thousand years waiting for a clear walking path through the ice to the rest of the Americas?

Thankfully, some archaeologists are more sensible than the bulk of their colleagues and have begun arguing that the Americas were initially peopled by boat people who, having fished and sealed their way along the coast from Siberia through Beringia to Alaska, skirted the ice sheets covering the Northwest Coast, hopping from one ice-free cove or island to another until they got south of the ice and had clear rowing (or sailing) to warmer parts further south.

I find it far more believable that a boat-based, marine mammal hunting and fishing people just kept going south in their boats, filling the coast throughout both continents and then gradually moving inland, than the mainstream theory that both continents were filled virtually all at once by land-dwellers who sprinted through a dozen different climates and biomes, their population exploding while they were constantly on the move, on foot, ever southward.
glaurung: (Default)
Eurocentric history tends to be very self-congratulatory. "(European guy) invented (arguably very important technology)," and "(historical change of import) happened because of (thing european or Greco-Roman people did)." It gets very tiresome, especially when the technology in question was actually invented hundreds or thousands of years earlier, far outside of the Euro-Greco-Roman sphere.

Today's example: a thread by Incunabula on Twitter. In James Burke-ian Connections style, he says that "Cheese 🧀 is one of the 5 things the Western book as we know it depends on. The other four are snails 🐌, Jesus ✝️, underwear 🩲 and spectacles 👓." Sigh. Burke's "Connections" blew my mind when I was eight. Because I was too young to notice how a shallow, facile and simplified, all white, mostly male narrative was being constructed from a far richer and larger history.

Today I'm tackling two of Incunabula's five things: cheese (parchment) and Christianity (the adoption of the codex). My primary source for most of what follows is an online version of "The Birth of the Codex" by Colin Roberts and T C Skeat, supplemented by Wikipedia and lots of blog posts/articles found via google. The online version of Birth of the Codex includes (in green and red text) incomplete edits and updates of the original book.

1. "Cheese" (aka, parchment/vellum):
( Read more... )
How much did Christianity's extremely early and wholehearted embrace of codexes have to do with the switch from scrolls to codexes? Not a lot. Christians were a tiny minority in a vast empire until well into the 3rd century. By the time Christianity became a major force in the empire (~300 AD), the switch to codexes was already underway. Clearly the rise of Christianity to the empire's official religion in the 4th century greatly accelerated the transition, but the evidence doesn't point to Christians having started the trend.
glaurung: (Default)
When the "cities plant male trees because they don't want to deal with cleaning up fruit, and this makes allergies worse" meme made its rounds a while back, I didn't think much of it, but now that I've read this thread, I went and refreshed my memory about tree sex.

The meme is wrong, but the thread linked above swerves too far the other way and ends up wrong as well.

Actually, species that have separate sexes make up about 1/5 of the trees in the Eastern US. When planting trees of those species, cities may choose to go with male trees only, but probably don't bother unless the fruit/seeds are large and messy but the trees themselves are desirable (selecting for the sex of the tree adds to the cost, and cities are loath to spend more money when they can avoid it by just picking a different, non-messy species or not bothering to screen their plantings for sex).

Things to remember:
1. Plant sex is complicated.
2. Dioecious is a fancy Greek term for species that have separate male and female individuals. Nearly all animals we're familiar with are dioecious. Among plants, dioecy is relatively rare. The "urban male trees are responsible for your allergies" meme is talking about dioecious species, which are a minority among trees (1/5 in the Eastern US, 1/20 worldwide).
3. The only pollen that's an issue for those with allergies is pollen from wind-pollinated species, which would be many trees, most grasses, and various herbs (flowers that look pretty are insect-pollinated, not wind-pollinated). The pollen from male dioecious trees that have been deliberately planted because they are male and will not litter the ground with fruit is a drop in the bucket of the overall pollen load.

Nerdy tree sex tidbits:
4. Non-dioecious plants cover a lot of variety. They can produce flowers that each perform both sexual functions (cosexual). They can have separate pollen-producing and seed-producing parts on the same plant (monoecious). Or, just to be different, they can have some combination of cosexual and monoecious bits on the same or separate plants (polygamous).
5. Monoecious and polygamous trees cover the gamut from having sexual phases of life (eg, making pollen when young and seeds when mature), to switching from making pollen to making seeds and back again every year or every few years, to being boring and producing both each year.
glaurung: (Default)
The default free software recommendation for cloning a hard disk, Clonezilla, is actively and aggressively user-hostile, like many programs from the world of Linux. But it's free, and comes with the special sparkle of being open source, so it gets recommended a lot.

This has led me to more bouts of swearing and struggling with balky upgrades than I can count. But it seemed all the user-friendly alternatives I could find were for-pay products that haughtily refused to clone a boot disk unless you forked over tons of money for the pro version.

Now, I have finally, finally found an app whose free version lets you clone a system disk, with no fuss or bother about size, without having to go through an intricate dance of copying each individual partition and hoping it would still boot when all was done.
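For anyone curious, here is a rough sketch of what that partition-by-partition dance can look like when done by hand with common Linux tools. This is my illustration, not anything Clonezilla or Disk Genius actually does; the device names are hypothetical, it assumes a GPT disk and a target at least as big as the source, and it still does nothing about bootloaders or resizing, which is exactly the part that tends to break:

```python
# Hypothetical sketch of a manual partition-by-partition clone on Linux.
# Device names are made up; run as root; the target disk must be at least
# as large as the source. This copies raw bytes only -- it does NOT fix
# bootloaders, UEFI boot entries, or partition sizes afterwards.
import subprocess

SOURCE = "/dev/sdX"   # hypothetical source disk
TARGET = "/dev/sdY"   # hypothetical target disk

def run(cmd):
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

# 1. Copy the GPT partition table from SOURCE onto TARGET, then give the
#    target fresh GUIDs so the two disks don't collide.
run(["sgdisk", f"--replicate={TARGET}", SOURCE])
run(["sgdisk", "-G", TARGET])

# 2. Copy each partition's raw contents, one at a time (adjust the range to
#    however many partitions the source actually has; NVMe-style device
#    names would need a "p" before the partition number).
for n in (1, 2, 3):
    run(["dd", f"if={SOURCE}{n}", f"of={TARGET}{n}",
         "bs=4M", "conv=noerror,sync", "status=progress"])

# 3. ...and then hope it still boots.
```

The appeal of a tool that just does the whole disk in one go, boot partition and all, should be obvious.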

Disk Genius is one of those rare birds, a for-pay app where the free version is actually extremely useful and capable. It's almost never recommended for partition wrangling and cloning, because it's marketed as a data recovery app... but what it actually is, is a universal, do-everything tool for dealing with computer storage devices and the data on them, including cloning, partition resizing, and recovering lost data, lost drives, etc. And now, finally, I can tell Clonezilla and its user-hating interface to not let the door hit its ass on the way out.
Disk Genius: one of those apps that makes you wonder where it's been hiding all these years. A+. Highly recommended.
glaurung: (Default)
Another thinky post. This one will be short. First, a shout-out to [personal profile] conuly who kindly drew attention to my error in the last post about food insecurity, and also gave me an invaluable new vocabulary term: "Food Collecting Peoples" instead of "Hunter Gatherers" does away with the "hunter" label that brings up sexist and inaccurate ideas about how those people lived.

I made a mistake in the first "extra stuff" note I put in the comments to the last thinky post about the invention of agriculture and whether inequality and war are necessarily linked to "civilization."

It's a very common assumption that food collecting peoples live a more precarious and food insecure existence than farmers - that they are more in danger of starving to death. This has generated not just harmless mistakes like my footnote but scads of bad science based on this assumption, such as the harmful "Thrifty Genotype" hypothesis among dieticians, which assumes that weight gain happens because humans are adapted to survive alternating waves of feast and famine, and thus that self-starvation through dieting is the only proper way to address obesity (the more I learn about diet and obesity science the more I learn just how wrongheaded and discriminatory the entire discipline is).

In fact, food collecting peoples were *not* more prone to suffer food insecurity than farming peoples. Three separate analyses using the Standard Cross-Cultural Sample (TIL that there is a standardized data set for making cross cultural analyses, tailored to eliminate similarities due to cultural borrowing by limiting itself to cultures that are widely separated in space/time) bear this out. Comparing across all cultures in the sample, there's no difference in food insecurity between farmers and food collectors. If you control for climate (because arctic food collectors like the Inuit *are* more food insecure and in danger of starvation), then food collectors are *less* likely to be food insecure than farmers. Farmers are tied to their land and at the mercy of whatever happens to their crops, but food collectors can pick up and move to a different area, or simply switch to a different food source that was not impacted by the drought/flood/whatever (here is the only open access article of the three. It's the most recent, and footnotes 19 and 20 link to the other two articles. CW: the article's discussion centres the obesity research angle).

The myth of food collectors' food insecurity is mostly born of prejudice (the assumption that life "in the state of nature" was "nasty, brutish and short" dates back at least to the 17th century). Some of it is due to selective noticing of the data: famine and food insecurity were at least sometimes an issue for both food collectors and farmers. And some of it traces back to the artificially created food insecurity of people under colonialism and post-colonialism, which the colonizers always blamed on the victims rather than admitting their role (having your land arbitrarily chopped into blocks that you're not allowed to go onto is not good for food collectors, even before we add the colonizers actively murdering them). Even today, most google hits for "hunter gatherer food insecurity" are papers and articles about how *former* food collecting cultures are suffering food insecurity now that they have largely ceased their own collecting practices and come to rely on food distribution by the nations in which they live. And, finally, at least in my case, some of the myth is due to wrongfully applying the special case of arctic peoples (the one climate where agriculture is impossible and food collectors do suffer from increased food insecurity) too broadly - I remembered reading about how Nanook (of the 20's documentary Nanook of the North) starved to death a few years after the film was made, which was explained as not unusual among the Inuit, and I took that as confirmation that I could accept the received myth and didn't have to google yet another fact.

So, with that myth debunked, why, exactly, did food collecting people switch to farming over most of the world? Farming is measurably worse by almost every metric: more work for the same or greater food insecurity, with more disease, worse nutrition (from a less varied diet), shorter lifespans, shorter adult stature, etc.

That farming made you more susceptible to disease was not evident to premodern people lacking tools to make statistical analyses, but at least some of the consequences of farmers' ill health *were* visible. An example off the top of my head (from the book 1491 by Charles Mann): early European accounts of First Nations people mentioned how healthy, tall, robust, and handsome Indians were compared to the malnourished, disease-ridden Europeans. Most of the Indians in question were farmers themselves, but they had a broader, more nutritionally complete set of crops and, without domesticated animals, they were relatively disease-free. If the bigoted Europeans noticed and commented on the difference between better-nourished, disease-free First Nations farmers and themselves, then food collectors must have noticed differences in health between themselves and the farmers whose technology they adopted.

First, in some places, food collectors didn't switch so much as get assimilated by farmers who moved into the area - genetic analysis of human remains from central Europe shows that the switch from food collecting to farming involved a genetic change, with an influx of people showing some degree of Anatolian ancestry moving in with their farming technology and mating with the local food collectors. But in other areas that genetic shift does not occur (the food collectors of the Baltic states, for instance, adopted the agricultural technology but did not interbreed with the people that brought it to them).

Second, depending on how violent that assimilation process was, people like those Baltic food collectors might have adopted farming in self-defence, regardless of the downsides.

Elsewhere, for instance in North America, there's clear evidence of agricultural technology diffusing without attendant migration, so: no assimilation or threat of assimilation. Why switch to a food system that required more work, had visible negative effects on the people who adopted it, and provided no real improvement in food security?

One common answer (eg, Jared Diamond's) is that they were forced to by population pressure. This is Malthusian bullshit (another thinky post about Malthus being completely wrong will happen someday). Humans have always had the ability to limit their family sizes. Population only increased when technological change made it possible to reduce the land area needed per person. Look at the times between those technological shifts, and population remains extremely stable, with little to no growth for vast stretches of time. There was no Malthusian pressure on food collectors to increase their food supply. Population increases happened after they changed their technology, not before.

Another answer I've seen mooted is that agriculturalists live settled lives and that enables them to accumulate more belongings and become richer than nomads. This overlooks the vast number of settled food collecting societies, where rich natural food sources meant people could live in one place permanently and own lots of things, without becoming farmers. It also overlooks that nomadic food collectors had a home range with which they were deeply familiar, and a limited number of home camps that they visited at more or less set times, depending on the season and what food sources were due to become collectable where. They could cache belongings at those camps and not have to limit themselves to what they could carry. So they weren't necessarily as poor and bereft of possessions as the popular conception of them would have it.

A lot of the links I get when googling for reasons that food collectors switched to farming focus on the *invention* of farming, and suggest that this happened because settled food collectors in naturally rich areas (like the parts of the Fertile Crescent where wheat farming was invented) had to either become unsettled or invent new ways to get food when the place they had been living became less rich, whether due to climate change or over-exploitation. Which is not at all in accord with what we know about the actual timelines of plant domestication, extending as they do back to the height of the last glacial period, so that food collectors were perfecting agriculture while the climate was improving and the richness of their homeland was increasing (see my previous thinky, linked above).

To restate the question: of the food collectors who had the choice to adopt already-invented farming technology (many desert/steppe dwellers and all arctic people did not have that choice, nor did those who adopted it under the threat of assimilation), some did not adopt the new tech, or resisted doing so until colonialism/invasion took away their choice (maybe because they saw evidence of the many downsides of farming). Others accepted the choice, despite those visible downsides: why? I still haven't found a reason proposed that sits well with me. But I do have a crackpot theory of my own.

Maybe, just maybe, it was because while agriculture did not provide any real benefit to settled food collectors, it did give the *appearance* of benefit. It gave the illusion of control: it made the people who did it feel that they were better able to ward off bad times, because as a farmer you were creating your own food, instead of being dependent on the forces of nature to provide food for you. Food collecting meant being at the mercy of countless factors beyond your control or ken. Farming meant being at the mercy of just one: rainfall. It wasn't actually better than food collecting, but it felt better, because *it was less scary*, and that's why it proved so popular.
glaurung: (Default)
Another thinky post. This one has been brewing for a while. Unless otherwise indicated, links are open access/not paywalled. (ETA: see comments for some interesting things/extra thoughts that didn't fit in this monster of a post)

I started googling articles about the origins of cities and agriculture due to my disaffection with the opening parts of Karen Armstrong's "Fields of Blood," where she assumes that military violence and structural inequality are necessary ingredients to creating a civilization, and links "civilization" (ie, inequality and warfare) to the invention of agriculture. That didn't sit well with my memory of Catal Huyuk, one of the earliest known cities, 6,000-ish people living in a town in south-central Turkey about 9000 years ago. Catal Huyuk had no city wall or other signs of military defences. It also had no temples or palaces, and only minimal signs of social inequality. Just a community of thousands of people living in peace, trading obsidian tools for luxury goods from distant communities.

And then more recently I read an article about Gobekli Tepe, the oldest megalithic site yet found. It's a large worship complex in Turkey near the border with Syria that was built starting 11,500 years ago, by people who did not yet have domesticated plants or animals. And that turns the conventional model of the history of cities on its head.

The traditional model of prehistory, which Armstrong hews to, holds that hunter gatherers had extremely low population densities, with bands of 40-ish people occupying a large home range. In places like the fertile crescent, wild precursors to crops provided them with enough food that they could settle down and begin the process of inventing agriculture. That process is supposed to have begun around the end of the ice age, when human-influenced mass extinction of megafauna made big game hunting no longer a viable survival strategy, and when the climate became warmer, wetter, more stable, and generally congenial to the invention of gardening. Farming started out in small communities, and only over time as the total food surplus increased and trade fostered more specialized crafts did hamlets become villages, towns, and cities. But Gobekli Tepe was a fair sized community of (technically) hunter-gatherers existing before plant or animal domestication.

Which made me go pull Jane Jacobs' Economy of Cities off the shelf. Jacobs's book is a pro-urbanist argument that cities are central and primary to the economy of the surrounding lands, contrary to the tendency of urban planners, architects, economists, and other thinkers from the 19th and early 20th century to regard cities as a bad thing, an aberration of capitalist development that should be done away with when creating planned communities. But Jacobs starts the book by arguing that cities existed before agriculture, that it was urban living that created the necessary conditions for the development of agriculture - on the one hand, providing the necessary gathering of minds needed to spur intellectual ferment and technological progress, and on the other, creating a logistical problem of feeding an ever-growing urban population, to which animal husbandry and agriculture were the solutions.

Archaeologists dismissed that part of her book, and urbanists that praise the rest of it tend to glide in embarrassment over the first section. But now we have Gobekli Tepe. A community of hundreds (judging by how much work was required to build the megalithic temples) of settled, non-nomadic people who didn't have domesticated crops or animals.

Exactly how large the community was isn't yet clear - until very recently no houses had been found, and it was thought that people didn't live there full time, but just came together from the surrounding lands to build shrines, worship, and then dispersed to nomadic hunter-gathering again once a shrine was completed. Which makes little sense, of course, but archaeologists are very good at swallowing camels and straining at gnats when those camels and gnats are inconvenient to their preconceived ideas about what early people did and did not do. A few years ago, in the process of constructing a visitor's centre at the site, including a big permanent tent to shelter the excavated megaliths, they finally dug deeply enough to find houses, and the picture changed from a collection of temples in the wilderness to a more sensible one of a town of people who devoted a lot of resources to building worship spaces.

A town, but not really a town of hunter-gatherers, except technically. They didn't have domesticated animals and the bones of animals found in their kitchen waste show they hunted extensively, but Gobekli Tepe is in the middle of the part of the fertile crescent where agriculture was invented, and it flourished in the immediate pre-agricultural era, when people were perfecting their skills at raising and harvesting edible wild plants.

But first we need to backtrack. The myth of "man the hunter" refuses to die. Early people were not hunter gatherers but gatherer hunters - if they were anything at all like modern non-agricultural peoples, meat from hunting was only 20% of their diet. But because bones don't decay like plants, archaeologists see the evidence of hunting first, and evidence of plant gathering second and only if they use microscopes to examine the dirt adhering to stone tools and containers for tiny fragments of seeds. 19th and early 20th century archaeologists washed their finds, which eliminated the clues needed to verify what plants the people who used those tools had been processing. Archaeological techniques have advanced, and an article from 2010 reports that unwashed grinding stones from diverse sites in Europe showed signs that the owners had used them to process grass seeds and cattail roots, as far back as 30,000 years ago. That's deep in the depths of the last ice age.

We also know that ice age humans were harvesting the wild ancestors of modern domesticated crops. At Ohalo II, a cluster of 23,000 year old huts and hearths on the shore of what is now the Sea of Galilee (back then a single body of water, called Lake Lisan, encompassing the Dead Sea, Jordan River, and Sea of Galilee), stone sickles were found, as well as remains of wild wheat, barley, and other grains (the site burned, then quickly afterward flooded when the lake level rose. Charring followed by anaerobic conditions led to excellent preservation of organic material like plant parts. The site was discovered when a drought temporarily lowered the lake level several meters in the late 80's). Furthermore, a genetic analysis of plant parts found in archaeological sites across the Fertile Crescent indicated that humans had started to select against the genes controlling seed dispersal (genes dealing with the shattering of ripe ears of grain) up to 25,000 years ago for emmer wheat and 32,000 years ago for einkorn wheat.

Which throws the entire "agriculture happened when it did because the ice age was over" meme in the trash heap. Contemporaries of the people hunting mammoths were practising agriculture in the wetter fringes of the harsh desert that filled the Middle East during the last glacial maximum. During the LGM, roughly between 31,000 and 16,000 years ago, the entire planet was much colder and much drier than today (check out the vegetation map, and note that "Extreme desert" means less than 2% plant cover - basically like the dry parts of the Sahara desert today).

But, and a big but: that map is rather coarse grained and doesn't allow for rivers or microclimates. The Nile still existed, at a reduced rate of flow, and the upper Nile backed up behind sand dunes to create lakes in the desert where people lived and fished on the shore. The Tigris and Euphrates also still existed. Lower sea levels meant the Persian Gulf was a dry river valley surrounded by harsh desert. While the homes of the people who lived there are now underwater, the sudden appearance of people (paywalled) to either side of the gulf immediately after the sea level rose suggests that humans made their homes in the bottom of the gulf, then were forced to migrate into the less hospitable surrounding deserts when it became flooded. And in Palestine, the shores of Lake Lisan were fertile enough to support humans despite the entire area being "extreme desert" according to the map.

So, agriculture (in the sense of humans planting and harvesting grains like wheat) got its start not in the fertile Holocene after the ice had melted and the planet warmed up, but during the worst, most brutal parts of the last ice age. Forests were a lot sparser and more limited in extent during the ice age, but I don't doubt that where there were forests, humans were creating forest gardens back then as well.

And at this point we're starting to get far enough back in time to tangle with another mental block archaeologists have, and another extremely racist myth that I was peeved to see Karen Armstrong perpetuating: that "cognitively modern humans" got their start much more recently than anatomically modern humans. Something was lacking, the myth says, in Homo Sapiens before 40-50,000 years ago (Armstrong says 20,000, which is even more ridiculous, since we do have widely accepted and extremely well dated examples of "modern" tools, art, etc from well before then), something mental that suddenly changed, "coincidentally" around the time they started spreading out from Africa. Believers in the myth point to the seeming explosion in the diversity of human tools and artefacts around that time. Sceptics patiently dig up examples of "modern" tools from much older sites, and defend their reality as artefacts, their dating, and their interpretation against vehement attack. Eventually, one hopes, the mental block will be discarded for the racist claptrap that it is, the refusal to see early, mostly African humans as qualified to be counted as "human" in the same way as their descendants who left Africa.

This is related to the mental block archaeologists have against admitting that early humans could produce art, such that any obvious examples of art found that date before a certain time frame get ignored, attacked as not actually made by humans, or attacked as not actually as old as proponents think. Which brings me to another book (last one, I promise!), "Lost Civilizations of the Stone Age" in which Richard Rudgley doesn't address the myth of cognitive modernity directly, but attacks it by attacking the refusal to give legitimacy to artefacts and art that are deemed "too old to actually be art/advanced technology."

So, when did agriculture get its start? It's very hard to say, especially since many of the best sites for gardening during the ice age are now under the ocean. But clearly, it wasn't 12,000 years ago, and it wasn't in response to the extinction of megafauna or the moderating of climate as the glaciers melted.

Even if we restrict ourselves to people who had domesticated plants (the last 12,000 years), at least half of the age of agriculture happened before humans invented inequality, wealth hoarding, and empire building, the violence that Armstrong sees as inseparable from civilization. If we include the long period of gardening plants as they gradually became domesticated (and Gobekli Tepe shows that such gardening was productive enough to support towns of at least a few hundred people engaged in major construction projects), then the age of inequality and brutality that Armstrong equates to civilization has been around for only a quarter of the time in which humans had the ability to build towns and have food surpluses.

Inequality isn't inextricable from civilization, it's just one approach that has violently exterminated all the other approaches it came in contact with throughout history.
glaurung: (Default)
Famously, the Incas did not use mortar in assembling their stone buildings, which were so perfectly carved and set that there was essentially no gap between the blocks. Each hand-carved stone block was made to mirror the concavities of the block below with convexities of its own, so that once laid, the blocks interlocked, with each course resting in hollows in the course below - this made the walls extremely durable in the face of earthquakes. The stone masons took irregular rocks and made them fit together perfectly without doing away with the irregular shapes, resulting in striking patterns in the stone. (OTOH, they also made walls with rectilinear blocks; I think they simply used whatever rock was at hand. Loose rocks cleared from the construction site produced irregularly shaped blocks, while quarried stone produced rectilinear blocks)

The outside faces of the walls were chiselled and sanded smooth for aesthetics, especially on more important buildings and high status homes. The inside faces were usually not cut or polished smooth, and there was not as much care to make the gaps between blocks as tiny as possible, except in the very highest status buildings.

But, I had trouble finding imagery that contrasted the pretty vs not so pretty sides of a stone wall, so I'm not sure just how much the inside walls were different from the outside walls. Decorative bas relief carvings were sometimes executed on the blocks. The Incan stonemasons were willing to work with whatever sized rocks were handy, from small blocks to megalithic scale boulders.

Looking for information about Incan stonework led me to a fascinating article that tries to untangle just what it was that the Incas used instead of mortar. We know from early Spanish accounts that they did use something, which Spanish eyewitness chroniclers did not properly understand but tried to describe anyway. Some said the Inca stone workers used a reddish mud. Others, that they poured molten gold and silver into the cracks between the stones. The Incas themselves said the stones were made to flow into place. The article assembles all these tidbits and suggests that the stonemasons were using mud from copper and tin mine tailings, in which sulphur-metabolizing bacteria would eat iron pyrite, producing extremely strong sulphuric acid. Chemical reactions between the acidic mud and the rocks would have given off steam, and that plus the glitter of fool's gold might have confused the chroniclers who wrote about molten metal. Acidic mud, the author argues, helped to temporarily soften the stone of the blocks, encouraging a more perfect, gap-free fit as the softened stone would flow ever so slightly as it re-hardened. Close examination of the joins between stones today shows a tiny discolouration that suggests such a chemical reaction took place. It's fascinating, and worth a read if you have the time. Otherwise, enjoy photos of Incan masonry here.
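(A chemistry footnote, mine rather than the article's: what's being described sounds like the standard acid-mine-drainage reaction, in which sulphur- and iron-oxidizing bacteria enormously speed up the oxidation of pyrite into sulphuric acid. The usual net equation, written out, is roughly:)

```latex
% Net pyrite oxidation (acid mine drainage); bacteria such as
% Acidithiobacillus species accelerate this reaction enormously.
\[
  2\,\mathrm{FeS_2} + 7\,\mathrm{O_2} + 2\,\mathrm{H_2O}
  \;\longrightarrow\;
  2\,\mathrm{Fe^{2+}} + 4\,\mathrm{SO_4^{2-}} + 4\,\mathrm{H^+}
\]
```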
glaurung: (Default)
I was reading about Machu Picchu and Inca stonework last week, and came across the hoary old colonialist talking point that the Incas did not invent the wheel. Being in a "go down internet rabbit holes" mood, I found myself reading various explanations for why precolumbian Americans, despite having wheeled figurines (see: image here and writeup here) and early prototype pottery wheels (the kabal/molde), never scaled those figurines up or broadened their application to transport. Very few of the explanations sat well with me.

( Read more... )
Central American pottery wheels were still at the "spin the pot relatively slowly while laying coil" stage when the Spanish arrived. Not all the pieces had come together yet, and thanks to the conquest, they never would.

(extra bits that I found while writing this but that didn't fit are in comments)
glaurung: (Default)
Two books that have been on my mind recently.

I have slowly been working my way through Karen Armstrong's Fields of Blood, which is about the degree to which religions advocate for or support violence. It's basically an in depth "it's more complicated" rebuttal to the Islamophobic claim that Islam is a warlike religion, as well as to the atheist talking point that Christianity has been the cause of innumerable wars through the centuries.

Armstrong starts by noting that civilization, as traditionally defined, is founded in violence - the expropriation of food and labour from the poor by the ruling elite, on the one hand, and the destruction of the poor by the elite's soldiers in wars of conquest, on the other. Armstrong's expertise is the history of the major religions of Eurasia, but I found myself arguing with the book quite a lot in the early chapters where she relied on some out of date archaeology to talk about the origins of agriculture and cities.

Cities and civilization did not have to be founded on violence - we have the peaceful, undefended ruins of Catal Huyuk, an ancient city built without fortifications or other defences against attack, which also seemed not to have a ruling class of haves who stood above the have-nots. Kingdoms and empires, and the wars they engaged in, dominate traditional world history lessons because the warmongers conquered everyone else and got to write the histories. But they're not the only way our ancestors did civilization.

Recently I took a break from Armstrong to re-read another book on religion and war: Barbara Ehrenreich's Blood Rites. Ehrenreich is a journalist rather than a scholar, and as a historian I found things to disagree with all the way through her book, but the core thesis seems pretty solid:

1. War is an ancient activity, but not universal - it seems to be a cultural disease, a meme that infects cultures. Once one group arms itself and attacks its neighbours, those neighbours have to follow suit or be destroyed. And thus the war disease spreads through the world. Always, however, humans seem to talk and think about war in religious terms, especially calling the death of soldiers "sacrifice."

2. Sacrifice, in turn, while not much practised by modern monotheistic religions, was the bedrock of all worship, including monotheistic worship, throughout the ancient world, and (often in a tamed and vegetarian form), still is the bedrock of polytheistic worship everywhere. The gods demand not prayers but fresh blood and meat. They are envisioned to actually be feeding on the meat that is burnt upon their altars. Gods are carnivores.

3. The gods of the ancient world were not humanity's friends, but rather dangerous and cruel beings with violent and destructive inclinations. Gods do not protect humans from natural disasters, they are the cause of those disasters. Worship and sacrifice are all about appeasing them and preventing them from destroying humanity.

4. Humans are prey animals: our distant ancestors often ended up as food for leopards, and even in the era of agriculture and civilizations, big cats continued to hunt and kill farmers and shepherds as well as their livestock, until humans exterminated enough of them that the threat became rare. (Ehrenreich speculates that the architecture of ancient settlements that had holes in the roof accessed by ladders instead of doors - eg, ancestral pueblo cultures in the US Southwest and the inhabitants of Catal Huyuk - was about reducing the threat of predators coming into people's houses at night).

5. Our gods are predators, and the original sacrifice to them, before domesticated animals were a thing, was human sacrifice. Whatever it might have become since then, religion started out as an attempt to prevent disasters (including predation) by appeasing predator-gods with gifts of fresh meat.

6. Naturally the best human sacrifices are people who don't belong to the community making the sacrifice. And thus, war began, in the distant past, as a religious activity, as raiding parties that enabled one group of hunter-gatherers to appease their gods with the blood of members of another group of hunter-gatherers. While everything else about war has changed beyond recognition, the religious language used for it, and quasi-religious way of thinking about it, remain unchanged from its roots as a religious practice, raiding one's neighbours in order to feed one's gods.

Religion has of course since then become a lot of other things: returning to the themes of Armstrong, the core of most of today's major Eurasian religions (Armstrong sadly has never written about precolonial religions of Africa or the Americas; this is her great failing) is a quest to ameliorate human suffering, to fight against the grim truth that life is full of suffering and ends in death.

All through the founding documents of Judaism, Christianity, Islam, Buddhism, etc, are exhortations for people to help the less fortunate, be kind to their fellow humans, and to treat others as they would like to be treated. And those exhortations coexist uneasily with passages that depict the gods who are supposedly asking us to be nice to each other as violent, abusive, capricious beings that would just as soon destroy us if we don't feed them plentifully and regularly with sacrificial meat.

There's a huge gulf between religious institutions, which, in cahoots with the rich and powerful, seek to suppress dissent and keep the common people from disrupting the rapacity of elites; and spiritual movements, which have always been about helping the downtrodden and demanding that the rich share their bounties with those who have nothing.

Ehrenreich's book has a lot about warmongering elites who delight in war. Some of it falls afoul of her lack of expertise in the history she's covering, but I think it's interesting that while sacrificial religion seems to predate agriculture and the creation of "civilizations" which divide people into elites and commoners, the main proponents and perpetuators of war-based worship over the past 12,000 years have been those elites, while the main proponents of being nice to each other have been common people.

The elites of the ancient world delighted in being "hunters" - of literal animals, including big cats who prey upon farmers, and of their fellow humans, through war. Instead of worshipping predators, they became them. Blood sacrifice to capricious gods became monetary sacrifice to overlords who held all the wealth and power in society.
glaurung: (Default)
Another thinky thought post brought on by a video seeking to answer the question why Europeans enslaved Africans specifically.

And while the video didn't contain any misinformation, it felt a bit incomplete, because I've recently read David Graeber's "Debt: The First 5000 Years," and because of a recent post on the Collection of Unmitigated Pedantry blog, which talked about slavery in the process of critiquing a world-conquest strategy video game.

( Read more... )
Lastly, have a table from Debt, showing just how miserably poor Europe was compared to essentially everyone else in Eurasia, even when compared to nations from centuries or millennia earlier. Crappy climate > low agricultural productivity > low population densities > few and small cities > economic backwater.

Copying just one column of data from a table showing population and tax revenue for several ancient and early medieval nations:

Persia, 350 BCE: 41 grams of silver per person per year
Egypt, 200 BCE: 55 grams
Rome, 1 CE: 17 grams
Rome, 150 CE: 21 grams
Byzantium, 850 CE: 15 grams
Abbasids, 850 CE: 48 grams
T'ang, 850 CE: 43 grams
France, 1221 CE: 2.4 grams
England, 1203 CE: 4.6 grams
glaurung: (Default)
Blog post 1: The Unmitigated Pedantry blog mentioned in passing today that in medieval Europe, fortifications were built with thin stone walls, which were very easy to destroy with the early, crude cannons of the 1400's, while in China, fortresses were built with thick earthen walls lined with a thin layer of bricks, which were immune to early cannons.

Both places had access to the same kind of early artillery technology at roughly the same time, but in China, cannons were seen as a novelty of not much use. In Europe, the earliest, crudest cannons were a game changer, enabling the conquest of forts and cities without long sieges, leading to massive shifts in power as those who could afford cannons conquered their smaller, poorer neighbours, until the only nations left standing a few centuries later were countries that could afford the massive expense not just of cannons, but of building lots of all-new cannon-proof fortifications to defend their territories.

And this military transformation within Europe fed into other interacting factors to transform Western Europe from a poor backwater that was decidedly weaker than the vastly larger, more populous and far richer nations of Central and Eastern Asia, into a colossus of conquest that took over the entire world in the 18th and 19th centuries.

The question that the Pedantry blog did not address was why China built its forts so differently from Europe.

Which brings me to another blog post from last year: The Analog Antiquarian has been posting multipart essays about the 7 Wonders of the Ancient World for a while now. Sadly he is not a historian and sometimes uses old and outdated books as his sources, and I have found his novelistic approach sometimes offputting. But one thing I learned from his series a while back: archaeologists have never been able to find the Hanging Gardens of Babylon, of ancient clickbait fame ("You'll never guess what building is number six on our list of the 7 most awesome structures worth seeing in the world!").

The Hanging Gardens of Babylon seem never to have actually existed in Babylon, although there are scholars who think something like what the ancient lists described did in fact exist in Nineveh. That just trades one mystery (where was it?) for another (why did so many writers of the ancient world mix up two very distinct cities?).

But in the process of explaining the non-discovery of the Hanging Gardens by modern archaeologists, the Analog Antiquarian highlighted something I had already sort-of known: that ancient Babylon left behind very few ruins, because of its location. In the middle of a vast floodplain, quite far from any hills or mountains, with nothing but silt beneath their feet as far as they could dig, ancient Babylonians built everything, from hovels to palaces, out of mud brick, which over the millennia has completely eroded away into subtle mounds on the landscape, plus, sometimes, the ceramic tiles that once decorated the outer layers of the walls of more elaborate buildings.

For instance, we have today a reconstruction of the Ishtar Gate. The wood of the gate rotted away, and the mud brick of the walls that flanked it eroded to nothing, leaving only the ceramic tiles which adorned it and made it splendid enough to get onto the original lists of World Wonders (until a later revision bumped Babylon's walls and gates to make room for the Lighthouse of Alexandria). German archaeologists dug up the tiles of the gate in the years before the First World War, took them home, and reconstructed the gateway in 1930: today you can see it in Berlin's Pergamon Museum (imperial German funding meets colonial archaeology, sigh).

Thinky thoughts produced: China, like Babylon, is a civilization centred on floodplains (the Yellow and Yangtze rivers), where stone has to be imported and the easiest and cheapest way to build fortifications is with earth. And naturally when China's rulers expanded beyond the floodplains, they stuck to known and familiar technology, continuing to build fortifications with thick earthen walls even when stone was available. So they never had the kind of thin masonry walls that primitive cannon were useful against.

Whereas the nations of Europe are mostly not centred on vast floodplains where stone is hard to come by. Stone was the first thing they reached for when they needed to build a fireproof fortification, until cannons made such walls obsolete.
glaurung: (Default)
This post brought to you by my brain refusing to stop chewing on a bit of esoterica that no one outside of Apple pundits gives a flying fuck about.

In the mid 90's, Apple had a near-death experience. Fewer and fewer people were willing to pay a heavy premium for a special nonstandard computer when a much cheaper standard Windows 95 machine did 95% of what the Mac could do. And yet the company continued to churn out a huge swath of different models of computers, as well as printers, proto-PDAs called Newtons, and so on. Losses mounted, and Wired published a cover story about the imminent death of the company.

Then in the late 90's Steve Jobs returned to Apple and amputated big chunks of the company, cancelling numerous projects and product lines. He condensed the company's output down to exactly four Mac models, expressed by a famous (in Apple circles at least) graphic:

Grid-of-4

(ID: four computers in a grid. At the top, labels "consumer" and "professional", and along the side, labels "desktop" and "portable", with a blue CRT iMac, a blue Power Mac, three of the colourful clamshell iBooks, and a dark grey PowerBook)

Now almost immediately this grid acquired some footnotes - the laptops and Imacs started coming in different screen sizes, and once you'd chosen a size you had to choose among low/medium/high end specifications for the processor, etc. But for most of the oughts, Apple made exactly four kinds of macs and it was very easy to tell which one met your needs.

By the end of the oughts, the grid had expanded, without anyone ever actually saying anything about it. The new, unspoken Macintosh product grid looked like this (image thrown together quickly with a meme generator because I was lazy, forgive the small size and low quality)

grid of 6

(ID: six computers in a grid. Columns labeled "consumer | professional | tiny" across the top. Images of an IMac, a Mac Pro, a mac mini in the first row, and a plastic macbook, a macbook pro, and the old rounded corner macbook air in the second row)

It had taken Apple a couple of false starts to get there (the powermac cube, the 12" powerbook), but by the end of the decade they had expanded into a new product category: tiny computers. For a while, the mac mini was the smallest desktop it was possible to buy. For a while, the Macbook Air was the only ultralight laptop with an almost fast enough dual core processor and a full size keyboard.

Ten years later, the grid of six became a grid of five. The entire laptop market had glommed onto thin and light, and the niche, expensive Macbook Air had become Apple's best selling Mac, their new mainstream base model laptop. Apple's Mac Mini hadn't changed much in size, but it was no longer the smallest desktop, and the Macbook Air had competitors that were even lighter. Finally, Apple's "consumer" desktops had become powerful enough that lots of professionals were using them, and many consumers were buying the more expensive "pro" laptops. So the categories needed renaming. Instead of consumer and professional, let's call them "mainstream" and "high end", with "small" a better descriptor today than "tiny".

grid of 6 2020

(ID: a grid of six, this time labeled "mainstream, high end, small" across the top. Imac, Mac Pro, and Mac Mini in the top row, and Macbook Air, Macbook Pro, and a question mark in the bottom row)

The complications and footnotes with this grid are all in the laptop category. While the larger and more expensive Macbook pro has always remained solidly in the "high end", the Macbook air has jumped from "small" to "mainstream" in that it's now the default laptop that most people buy. And the smaller size Macbook Pro is suffering a bit of an identity crisis, as it's split into low end and high end models, distinguished by the number of ports they have. The low end "pro" model seems better suited to being the "mainstream" choice and letting the Air go back to being Apple's "small" laptop.
glaurung: (Default)
(note: A lot of this is inspired by the Collection of Unmitigated Pedantry blog's series last fall on premodern subsistence farming, and especially their addendum on rice)

Premodern subsistence farmers organized their farms not around maximizing yield, but around doing all they could to ensure their family did not starve. Extended families of 8-ish people would farm just a couple of hectares of land. Over time large farms would shrink as siblings each took their share of an inheritance, until they hit the minimum size of farm needed to produce enough food for everyone. This is true of all premodern farmers, from China to Europe. Everything was organized around guaranteeing, as much as possible, that no one in the family would starve between one harvest and the next. Each family would work several small fields scattered around the village where they lived, with each field in a different terrain with a different microclimate. If one area around the village had too much or too little water in a year, or if one hillside was blighted by disease or pests, everyone in the village would be a little worse off but no families would face complete destruction. Reducing the risk of starvation was the main priority, not producing a surplus of food to feed to non-farmers.
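(To make the risk-spreading logic concrete, here's a toy simulation; the numbers are made up by me purely for illustration, not drawn from the Pedantry blog. It compares a family betting everything on one consolidated field against a family working several scattered plots, each of which can fail independently in a given year.)

import random

# Toy assumptions: any single plot has a 1-in-5 chance of failing in a given year,
# and a family "starves" if it loses more than half of its expected harvest.
# Scattering plots across different microclimates is modelled as independent failures.
FAIL_CHANCE = 0.2
YEARS = 100_000

def starvation_rate(num_plots):
    starved = 0
    for _ in range(YEARS):
        good_plots = sum(1 for _ in range(num_plots) if random.random() > FAIL_CHANCE)
        if good_plots < num_plots / 2:  # lost more than half the crop
            starved += 1
    return starved / YEARS

print("one consolidated field:", starvation_rate(1))  # disaster roughly one year in five
print("six scattered plots:   ", starvation_rate(6))  # disaster closer to one year in sixty

Real fields obviously weren't independent coin flips, but the shape of the result is the point: spreading a family's land across many microclimates turns "a catastrophe every few years" into "a catastrophe perhaps once in a lifetime," which is exactly the trade-off a subsistence farmer cares about.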

Staying alive was a community effort. If one farm was pillaged (legally by the aristocracy or illegally by bandits), suffered a sudden death, or had an unusually bad harvest despite its scattered fields, other families in the village would help out. It's common to talk about this in capitalist terms, but that projects modern economic concepts of money and debt onto a past that was not capitalist but communalist. Money, debt, and a market economy existed, of course, but they were imposed on the farmers and the village community from above, by the wealthy and by the towns and cities that sold vital speciality goods to the farmers nearby.

All of that is universal regardless of what the farmers are growing. But the requirements of rice and wheat farming produced vastly different social systems and vastly different societies. Read more... )

In the rice belt of Asia, farmers did not have to pay their lords a fee in order to keep their families fed. They were self-sufficient in a way their wheat-raising counterparts were not and could not be. And at the same time their communities engaged in multigenerational projects to create more farmable land, projects that were simply impossible for their wheat-raising counterparts in Europe. I don't know enough about the history of China and other rice-based nations to say much about the impact this had on the very different histories of the two regions, but it does give food for thought.

One last thing: traditional farming in Europe and America is all but extinct. Essentially no one still grows wheat in order to eat it themselves, and all farmers, even the Amish, are more concerned with producing crops to sell than with feeding themselves. Farming families work far more than two hectares, and they don't worry about divvying up fields into small bits to minimize microclimate failure. In the rice belt, on the other hand, modern farmers still farm in the traditional way. They use fertilizer and high yield breeds of rice which let them produce a large surplus to sell, but the essential system - of small fields created with vast amounts of labour, flooded and farmed with even more labour - remains the same.
glaurung: (Default)
I bought a G4 Mac Mini, because I thought it might be fun to mess around with classic mac software someday, and I didn't want to get something large that would take up a lot of room, so that meant getting a mini.

Naturally I had to upgrade the spinning hard disk in it, because hard drives suck. After much searching I discovered you can buy mSATA to 2.5" IDE adapters fairly cheaply, which enable you to put a fast SSD into a 2.5" laptop sized IDE case. I got one, got an mSATA drive, and was all set.

I performed the upgrade... and could not get the computer to boot from a CD. I tried cloning the existing OS to the new drive, and could not get the cloned drive to boot. Key combinations that were supposed to force a Mac to boot from the optical disk failed to work. After weeks of banging my head against this wall, I finally realized that the original drive in the Mini was jumpered as the slave on its IDE channel, rather than as the default master. Fortunately the adapter had pins for a jumper. I took the jumper off the old drive, put it on the new drive, reassembled the Mini for the 6th or so time, and... it worked perfectly.

Long ago, in the early oughts, I knew about IDE drives and jumper settings, and I knew that only one drive on a cable could be set as the master at a time. But I had utterly forgotten about all that crap in the intervening decade. It doesn't help that most IDE laptops (and the mini is just a headless laptop) had two channels, one for the CD and one for the hard drive, so you never had to think about master versus slave. But Apple made the Mini with the cheapest, most minimal possible combination of parts, which means one channel. Since the optical drive has no jumpers and is always set as the master, the hard disk has to be jumpered as the slave.

None of the instructions I used, neither Apple's tech repair manual nor Ifixit, mentioned that you have to jumper the new IDE drive as the slave. I looked online and found that exactly zero of the top hits for Mac Mini G4 upgrades mentioned jumpers at all.
glaurung: (Default)
The Verge has a navel-gazing article about netbooks, those tiny, cheap laptops that were incredibly popular for a few years in the late oughts and then vanished utterly by the early teens. And by navel-gazing I mean that the author interviewed a few of his journalist friends, none of whom knew any more about the reason netbooks were popular than he did, and then wrote an article displaying his profound ignorance.

It's very simple, for those who are not narcissistic tech journalists: netbooks were two things, each with a significant market, at a time when the only way to take the internet with you was to carry a laptop. 1. They were tiny. At a time when a regular laptop weighed five pounds and a big-screen laptop six, netbooks weighed just one kilogram (a bit over two pounds). A netbook would fit in any old shoulder bag with lots of room for other stuff; a regular notebook required its own dedicated bag. 2. They were cheap. At a time when the cheapest full size laptops cost $600, and a decent Thinkpad cost $1000, a netbook could be had for less than $300.

Size of course was a huge selling point. At the time, the only viable way to access your email and read the latest doings of your friends on Myspace and Livejournal was with a laptop. A tiny laptop that didn't need its own bag and wouldn't take up the entire surface of your table at Starbucks was vastly preferable, even if it was molasses slow and had a keyboard made for hobbit-sized hands. And of course, tech journalists and other professionals who needed to travel a great deal were always looking for a notebook that was smaller and lighter, so they wouldn't need such a heavy carryon bag. Some of them were even willing to put up with a crappy undersized keyboard to get that lighter carryon. Ultralight laptops had existed for a long time, but they cost a lot more than a standard laptop, and were hard to justify on a journalist's salary.

Cost was also a huge selling point. A $300 laptop made owning any kind of computer possible for the first time for a huge number of low income people all over the world who would otherwise never have been able to afford one. People who might as well be utterly invisible as far as narcissistic tech pundits are concerned.

Then in 2010 Apple came out with the iPad, on the one hand, and with Mark II of the Macbook Air on the other. And within a few years the entire technology industry followed in their footsteps, as usual. Full sized but thin and ultralight laptops came down in price to $1000 or less, and siphoned off from the netbook market all of the professionals and writers who were looking for affordable-to-them small and light writing machines. Tablets and smartphones siphoned off all the people looking for internet access devices they could carry with them. Meanwhile, laptop makers started making full size laptops lighter and lighter, and selling them for less and less money, until the netbooks were left with no one willing to buy them.
glaurung: (Default)
I finally checked out some of the DC animated movies. The good ones were quite good. Sadly, neither of the Wonder Woman animated movies released to date were in that category.

Wonder Woman (2009) has a nonsensical villain - Ares is a god, and gods crave worship, so why the heck does he want to exterminate humanity and thus deprive himself of any worshippers? But the real problem is that it oozes frat boy sexism, from a slimy Steve Trevor to an Amazon who turns against her people because she was denied the opportunity to marry and have children (bleah). And Trevor saves Paradise Island from being destroyed, because behind every powerful woman there has to be a slimy man without whom the day would not have been saved. 🤬 Best avoided.

Wonder Woman: Bloodlines, meanwhile, is an incoherent mess of a film. The only explanation I can come up with is that DC had the script for a TV miniseries, but then a wild goat got loose, scattered the pages, and ate half of them at random. And rather than print out a new copy, they decided to just film the remaining half of the pages and call it a movie. It feels like a much longer story with all of the connective tissue removed. It never once stops to explain motivations, give characters a second to be themselves, or make much sense at all. Again, best avoided.

On the other hand... All Star Superman is a marvellous film that is just as good as the comic book miniseries it adapts. That's right, a film version that is fully faithful to the original material: something I would never have expected was possible from Warner Brothers.

Finally, Batman: Soul of the Dragon is a fun love letter to 70's martial arts movies. Thankfully it makes Richard Dragon (DC's version of Iron Fist, the white guy who becomes the world's best kung fu master) into an Asian man, thus reducing, slightly, the orientalist and racist nature of the material it's reworking. If it were all about Batman I'd not like it nearly as much, but he is just one of an ensemble cast of martial artists (and the only white person among them) who have to band together to defeat the bad guys. Plus, instead of being the best, Bruce Wayne is explicitly called out as the least skilled among them, which I thought was a nice touch.
glaurung: (Default)
About Joss Whedon and the shows he created... and about other creators who have done deplorable things.

There's a line in Lovecraft Country one of the characters speaks about their love of the racist ERB Warlord of Mars stories: “Stories are like people. Loving them doesn’t make them perfect. You just try and cherish them and overlook their flaws.”

But that's not quite right. You don't overlook their flaws. You love them in spite of them... or else the flaws become too huge and you have to break off the relationship because of them.

And both reactions are fine. Art exists both in and out of the author's shadow. There's a part of it that gets tarnished by the author's attitudes and behaviours. And there's a part of it that exists beyond that. It's possible to have your enjoyment diminished by the tarnished bits and still adore the rest. It's possible to not be able to enjoy anything about the work anymore because the tarnish has eclipsed everything else for you.

Different people are going to have different responses, and that's fine. As long as we aren't denying or minimizing the gravity of the evils committed by the artist, there's nothing wrong with boycotting somebody's books forevermore, and there's also nothing wrong with continuing to read and enjoy their work for the untarnished bits.

(steps down from soapbox)
glaurung: (Default)
So, I came across a post about queer themes in Wonder Woman, Wonder Woman's war-era sidekick Etta Candy, and Dr Wertham, which was so riddled with errors that I just had to write a post of my own (because comments were not enough).

Wonder Woman started out as feminist propaganda. Kinky, queer, bondage-obsessed, built on a 19th century kind of feminism very different from modern ideas (women are not equal to men but different, and women should be in charge because they will do a better job), but nonetheless, feminist propaganda. The queer kinkiness was filtered and coded, of course, by being published in comic books for children in the 40's, but it was still undeniably there.

Wonder Woman's sidekicks and Diana Prince's friends were Etta Candy and the girls of Beeta Lambda sorority at Holliday college. They were part of that propaganda message - promoting women's colleges, women's education and independence, and the idea that any woman can be a heroine like Wonder Woman if she puts her mind to it. Etta and her girls were also (coded, filtered) gay or bi characters, modelled on women that Marston's bisexual partners had known in the women's colleges they attended and the women's college sororities they had belonged to.

However, Etta was never Wonder Woman or Diana's girlfriend, even in subtext. From day one, the Wonder Woman comic adopted a genderswapped version of the Superman-Lois Lane dynamic, with Diana infatuated with Steve Trevor, who was infatuated with Wonder Woman.

Marston and his female partners co-created Wonder Woman and co-wrote each story, but sold them under Marston's name. When Marston died, the editors at DC refused to hire his uncredited women co-writers, and instead handed the comic over to Robert Kanigher, a typically sexist man who had no truck with all this feminist stuff.

Kanigher jettisoned the feminist messages that had appeared in every story, jettisoned most of the bondage themes, and jettisoned Etta and her sorority sisters. He kept (and enhanced) the eclectic, magic-meets-sf-meets-mythology-meets-fairy tales setting of Paradise Island, and kept the Diana-Steve-Wonder Woman love triangle. Because the love triangle was boring as fuck, he set a lot of his stories on Paradise Island. Without Etta and company, and without queer women co-writing behind the scenes, the comic became completely heterosexual, despite being often set on an island populated only by women.

Fast forward to 1953, when psychiatrist Fredric Wertham published a screed against violence and sexuality in comic books (expanded into a book the following year), which he felt were the root cause of juvenile delinquency and of the sexual irregularities of his child patients. Wertham's primary targets were crime and horror comics, but he did devote a little space to superhero comics like Batman ("a wish dream of two homosexuals living together") and Wonder Woman ("for boys... a frightening image. For girls... a morbid ideal"). Wertham's book states that it's based on seven years of research, which might explain why he called out the Holliday girls in Wonder Woman, as "gay party girls, gay girls" - despite the fact that Holliday college had been dropped from the comics for six years by the time his book was published.

Wertham was successful in virtually exterminating crime and horror comics, but he didn't actually have all that much effect on superhero comics - Bruce Wayne and Dick Grayson continued sleeping in twin beds in the same room together long after Wertham, and in the case of Wonder Woman, the censoring of gay themes had already been done several years before he came along.

Sources: Seduction of the Innocent, The Secret History of Wonder Woman (both on my shelf), various comic nerd web sites, and my own personal knowledge from having read tons of Wonder Woman comics, including reprints of dozens from the war years and a few from the post-war, post-Marston era.
glaurung: (Default)
I realized there was another list of GWG movies (in a paper book on HK cinema, rather than a website), and I found a few that I hadn't seen yet. Only the first of these is really any good.

Princess Madam (1989). Read more... )

Mission of Justice (1992) Read more... )

A serious shock! Yes Madam! (1993). Read more... )

The Avenging Quartet (1993) Read more... )

Madam City Hunter (1993). Read more... )
