On Molly

Sep. 22nd, 2017 09:27 pm
[personal profile] marta_bee
I stumbled across some Discourse (TM) over at Tumblr regarding Sherlolly tonight. The details don’t particularly matter -- the people posting thoroughly earned the capital-D in that moniker, as I recall, and it was the slightly snappy attitude I’ve seen too many places in too many different fandoms to want to dwell on. But it did get me thinking about Sherlolly generally, and how it played out (or didn’t) in S4.

See, I love Molly Hooper, particularly in S1 & S2 but also as the show evolved. I love imagining her going to post mortems while in training and Sherlock just kind of crashing them, turning up with overboiled tea and Kitkats and MST’ing the heck out of the board of doctors grilling the poor physician. Maybe crashing on the couch in her office when he was between flats, turning on smarmy charm that she knows is fake, and it being kind of a game with her. Noticing odd behavior and telltale physical signs and insisting on absolutely no more thumbs or kidneys until he pees in a jar.

Science bros, is what I’m saying. I love the concept of Molly having a bit of a crush on Sherlock but that trending to a real platonic friendship that predates John and probably even Lestrade. I’ve always liked the headcanon that Sherlock has so much access to St. Bart’s because he was a postgrad researching biochemistry, so I can easily imagine them training together and taking different paths, and generally knowing each other like only fellow PhD candidates or their med-school equivalent really can. I can also see Sherlock being extremely protective of her as something separate from the realm where Mycroft has any kind of power because she’s a) not really operating in his realm, and b) come on, she was turned on by Sherlock flogging a corpse and dated Moriarty. Those colors don’t run.

... It’s just possible I’ve thought about this. A bit.

And this is where I feel, not betrayed exactly, but certainly like I've lost my bearings a bit in S4, and TFP in particular. Part of what I loved about Molly was that she seemed to skirt the line of someone romantically/sexually attracted to Sherlock but who'd worked her way past all that. It's not that I can't enjoy fanfic about a romantic relationship, or that I want to discourage people who ship that. But I do feel the show lost something vital about the friendship between those two when it dipped into the romantic. Particularly when Sherlock's manipulated (even with the best of motives) into convincing her to say the words. It really is a rarity for a man and a woman to be allowed to stay friends. I wish they could have honored that. (Or still will, if the show comes back.)

And the irony's not lost on me. This is really similar to the arguments people have over shipping John and Sherlock, isn't it? Except literature is full of deep male friendships in a way it isn't for cross-gender ones, so it feels like something particularly unique is lost with the way this strand of the show played out. I don't know. Maybe this does just come down to personal taste.

How Charisma Makes Leaders Great

Sep. 22nd, 2017 01:22 pm
[syndicated profile] jstordaily_feed

Posted by Farah Mohammed

After the French election, a photo circulated around the internet of three portraits stitched together: France’s young and handsome Emmanuel Macron; Canada’s political Prince Charming, Justin Trudeau; and from the United States, a scowling, red-faced Donald Trump. One American commenter wrote, “This is so unfair.”

Though it was meant as a joke, it highlighted how much importance is placed on a leader’s appearance and likability. It’s easy to write off charisma as a superficial quality—Trudeau’s detractors say his good looks and easy charm distract citizens from his actual leadership. Yet studies suggest that charisma isn’t merely a distraction from good leadership; according to scholars Robert J. House, William D. Spangler, and James Woycke, charisma is in fact integral to it.

Leaders need to be as good with people as they are with policy.

Both Macron and Trudeau were less experienced than their predecessors, but had a significant edge over their competitors because of their attractiveness, charm, and energy. Great historical leaders carried countries through crises largely because of their personalities. Lincoln, Churchill, Gandhi, and Mao aren’t remembered just for strategic decisions, if they’re remembered for them at all. House et al. note that these leaders are remembered largely because of their innate ability to inspire others to follow them:

…charismatic leaders have an ability to understand and build on the needs, values, and hopes of their followers. They conceive and articulate visions and goals that motivate their followers toward collective action rather than self-interest. Thus, charismatic leaders generally use their power for the good of the collective rather than their personal good.

A heady dose of charisma allows leaders to squeeze the maximum efficiency from their administration, to give hope to communities, to inspire collective action and to unite divided ideologies.

The charismatic relationship consists of specific types of follower responses. These include performance beyond expectations; changes in the fundamental values and beliefs of followers; devotion, loyalty, and reverence toward the leader; a sense of excitement and enthusiasm; and a willingness on the part of subordinates to sacrifice their own personal interests for the sake of a collective goal.

This sounds powerful, and indeed, the cult of personality can be dangerous if taken too far. Charisma isn’t, however, a free pass for leaders to act as they please. “Pure charisma is acquired through achievement and has to be achieved over and over again by the leader,” write House et al. (Macron is learning this the hard way, as his bold reforms are irritating both the left and the right, and his approval ratings are diving. Once the rebel darling of France, he’s now under fire for spending too much on cosmetics.)

Nonetheless, the world’s current problems are as sociological and psychological as they are logistical. Leaders need to be as good with people as they are with policy if they’re to do any more than simply survive their tenure.

“Modern organizations need cohesion, inspiration, and basic values,” write House et al. “Effective leaders provide these through their own values, their personal example, their enthusiasm, and their confidence in themselves and in others. They are effective because they are charismatic.”

The post How Charisma Makes Leaders Great appeared first on JSTOR Daily.

[syndicated profile] jstordaily_feed

Posted by Cynthia Green

Frescoes, perhaps the best-known wall paintings, are the result of a chemical reaction turning paint and wet plaster into a single, solid surface of calcium carbonate. The wall must be wet, hence the name fresco, an Italian word meaning fresh, wet, and cool all at once.

Ideally, the entire painting is applied to the top coat of wet plaster, called intonaco, and the two dry together—that is, within a single day. Consider the size of most frescoes—Masaccio’s Brancacci Chapel murals (1425-1427), Michelangelo’s Sistine Chapel ceiling (1508-1512), and the thirteenth-century walls of The Church of Saint Eutrope in Les Salles Lavauguyon—and the process seems impossible. In reality, fresco painters painted in sections and interpreted “wet plaster” loosely.

Fresco painting splits loosely into Pre-Renaissance, Renaissance, and Post-Renaissance. Before and after the Renaissance, rewetting the wall, adding binding agents to colors, and various finishing touches were permitted, as this 1861 article explains. But Renaissance purists found this fresco secco (dry fresco) unforgivably lazy. They developed a new standard called buon fresco (genuine fresco) that was much more rigorous.

Detail of Masaccio’s Expulsion from the Garden of Eden (via Wikimedia Commons)

Pre-Renaissance, the fresco artist usually applied all layers himself. He began with the arriccio (rough plaster) on the wall, then drew onto the arriccio with charcoal, ochre, and red sinopia. After that, he spread the intonaco onto the drawings in large sections and let it dry. Painting with binder-enriched colors and egg or glue sealants began after the wall hardened. This was fresco secco. The result was beautiful, but somewhat opaque.

Buon fresco was possible because the Renaissance artist could hire an entire team to help. This made it practical to spread only enough intonaco for a day’s work—about one square meter (roughly ten square feet). Beginning at the top left-hand corner and working down, the artist traced lines and painted in earth colors. These dried “into” the intonaco for a more vibrant and durable result. Each day’s new intonaco slightly overlapped the edge of the previous day’s, creating daily sections called giornate (“day’s work”).

The grid concept of enlarging drawings also enabled buon fresco. Alberti’s cloth “veil” and the later rete (net) replaced the sinopia method with the cartoon. Cartoons were actual-sized drawings on thick paper either needle-pricked, placed against the wet intonaco, and dusted with black or red powder (spolvero), or traced into the intonaco with a stylus. Cartoon preparation happened in workshops. First, a team made small drawings and wax and clay models to study the effects of color, light, and shadow; then an oil painter prepared a colored sketch. It was a stressful labor of love, but even Raphael, Masaccio, and the impatient Michelangelo suffered it because the result was so beautiful.

Understanding the complexity of fresco makes it even more remarkable that young Masaccio revolutionized European painting with his frescoes. The vaulted ceiling of his “The Holy Trinity” at Santa Maria Novella (1427) and his murals at the Brancacci Chapel of Santa Maria del Carmine (1425-7) show a depth and appreciation of perspective never before seen. He accomplished this at the age of 25, and within the strict time constraints demanded by buon fresco.


The post How Buon Fresco Brought Perspective to Drawing appeared first on JSTOR Daily.

oloriel: A few lines of Tengwar calligraphy. (blatant tolkienism)
[personal profile] oloriel posting in [community profile] silwritersguild
Silmarillion40 Banner

Fëanor Makes The Silmarils by Alikuu. Fëanor experiments with the light-retaining properties of Silima in the privacy of his forge. (Artwork)

The Creation of the Silmarils by Amyfortuna. Fëanor's creation of the Silmarils, in the form of a sestina. (Poetry)




Just joining us? The Silmarillion40 is a collection of fan-created fiction, art, and poetry in honor of the 40th anniversary of the publication of The Silmarillion. For forty days, we will feature at least one fanwork, following the chronology of The Silmarillion. Find the full Silmarillion40 collection here.

An Airplane in Every Barn?

Sep. 21st, 2017 05:00 pm
[syndicated profile] jstordaily_feed

Posted by April White

Henry Bomhoff probably should have been working on his tractor. It was just a few days before the wheat harvest in the spring of 1935, a tough year to be a farmer on the central Oklahoma plains. Dust storms and drought had already driven many farming families from the region, and wheat prices were plummeting again, but even a meager harvest was a welcome one in the depths of the Great Depression. Instead, though, Bomhoff—Heinie to his neighbors—was working on something unusual in his barn: an airplane constructed from cast-off pieces of farm machinery, a Model A Ford motor, and a DIY kit mail-ordered from a mechanics magazine.

“The day my little puddle-jumper was finished I climbed in at once to test the motor and taxi around the wheat field,” Bomhoff said in 1945. “The first thing I knew the tail of the plane was off the ground—then the whole thing took off. There I was in the air without any idea of how to get back on the ground.”

“I guess I felt a little like the Wright Brothers when they made their first flight,” he reflected years later.

In a way, the comparison was apt. After he managed to land in a neighbor’s field, Bomhoff became known as “the original flying farmer,” whose curiosity and daring marked the beginning of a new, if short-lived, era of agricultural aviation.

The Department of Agriculture estimated that 3,000 army planes would become “air trucks.”

Bomhoff patrolled the skies in his homemade plane with a copilot and a shotgun. The already beleaguered Oklahoma farmers were at war with the area’s hungry coyotes, which saw the flocks of turkeys and herds of cattle as a food source. From the cockpit, Bomhoff believed, he could be a better rancher.

Soon, he replaced his makeshift wings with a Piper J-3 Cub, a light aircraft, and began to teach other farmers to fly, including at least 10 men who became U.S. Army pilots in Europe during World War II. Bomhoff was heralded as a “home front hero” in a 1945 syndicated comic strip, which rendered him as a slender, bespectacled man in a newsy cap. “Bomhoff has killed over eight hundred coyotes from the air,” it explained. “By reducing costly livestock and poultry damages, he is helping to relieve wartime food shortages.”

Bomhoff believed that the airplane had even greater potential: It could become the modern farmer’s tractor, an efficient way to herd cattle, survey thousands of acres of cultivated land, spread seeds and pesticides, and transport farm products to far-off places. The war had globalized American agriculture, as farmers sold to Allies around the world. Airplane manufacturers, too, saw a lucrative market; one advertised its planes on the back cover of Prairie Farmer by reminding farmers that, now, no spot on earth was more than 60 hours from the farm.

In 1944, Bomhoff got a visit from H.A. Graham, director of the Agricultural Extension at Oklahoma Agricultural and Mechanical College, and Ferdie Deering, farm editor of Farmer-Stockman magazine. The pair was inviting the state’s air-minded farmers, estimated at about 40, to Stillwater for the first meeting of the Oklahoma Flying Farmers Association. Bomhoff was elected as the group’s leader. The following year, the National Flying Farmers Association was incorporated, and by 1950, membership had grown to some 20,000.

The United States government was equally bullish about agricultural aviation in the post-war era. The Department of Agriculture estimated that 3,000 army planes would become “air trucks,” transporting produce from farm to city, and the Civil Aeronautics Authority encouraged pilots returning from the war to consider life as a flying farmer. “An airplane in every barn?” asked one Midwestern newspaper of the enthusiasm of the late 1940s.

But “the Flying Farmers’ vision of agricultural aviation never materialized,” writes historian Peter Simons. Ultimately, the mid-century enthusiasm could not overcome the economic realities. Owning and maintaining a plane became increasingly costly, distribution channels began to consolidate, and studies showed that spreading seeds and pesticides from the air was inefficient at best. Fewer and fewer farmers were willing to plow under profitable farm land for an air field or to give up their car for wings, as Bomhoff had in 1945. “If I had to let one of them go,” the original flying farmer had professed, “it would be the car.”

The post An Airplane in Every Barn? appeared first on JSTOR Daily.

[syndicated profile] jstordaily_feed

Posted by James MacDonald

Given the popularity of tattoos, one would expect the physical effects to be well known. In fact, the question has only recently been examined, with recent studies suggesting that tattoo ink can leach into the lymph nodes and that tattoos may reduce sweating. Reduced sweating impedes the body’s ability to cool off, potentially presenting problems for anyone who is heavily tattooed and exercising in the heat. But sweating aside, are there long-term risks to tattoos? Nobody really knows.

Many tattoo inks are chemically similar or even identical to commercial pigments used in printers or even paint. Are there serious adverse effects to injecting industrial paint under your skin? Nobody really knows. The largest regulator of food and personal items, the FDA, has authority over pigments used in external-use cosmetics, such as lipstick. Artificial pigments must be approved by the FDA and tested to ensure that they contain approved ingredients, but colors derived from natural sources are not tested at all. In practice, due to limited resources and a belief that cosmetics pose little health risk, approved cosmetic pigments are mostly regulated directly by the cosmetic industry.

Most tattoo inks are de facto unregulated.

There’s a loophole, however, large enough for a body suit. The FDA only exercises oversight over cosmetic pigments used externally. Internal use, i.e., permanently inserting pigments into the skin, is not regulated by the FDA at all. In a bizarre catch-22, since the pigments are not FDA-approved for use in tattoos, and only FDA-approved pigments are covered by the industry’s testing scheme, most inks are de facto unregulated. (The FDA will act if an obvious health problem is identified.) Furthermore, FDA ingredient labeling requirements only apply to products sold directly to consumers. Ink is sold wholesale in bulk to shops, so not only are the inks not FDA-approved, the ingredients are kept secret from users.

That leaves the regulation of tattoos to the states, where there is enormous variability in oversight. Many states, though not all, have some regulations regarding the practice of tattooing, but there are few regulations regarding the contents or safety of the ink. Research regarding the long-term effects of modern pigments, or how pigments react when tattoos are removed, is almost completely lacking.

As things currently stand, there is not yet evidence of long-term harm to most tattoo recipients, nor does the sweating study provide evidence of risk. While there have been a few infections caused by unsterile ink, licensed artists are mostly conscientious, and infection transmission through tattoos is uncommon. (The infection rate is higher in informal settings such as prisons or friends’ basements.) However, the rate of allergic and other adverse reactions has been increasing. Tattooing is an ancient practice, but the modern explosion of tattoo popularity and chemical pigments takes the health risks into unknown territory.

The post Why Doesn’t the FDA Regulate Tattoo Ink? appeared first on JSTOR Daily.

Stephen King’s Prophetic Early Work

Sep. 21st, 2017 11:00 am
[syndicated profile] jstordaily_feed

Posted by Matthew Wills

Stephen King celebrates his seventieth birthday on September 21st. The “King of Horror” has sold an estimated 350 million books. His work has been adapted for many feature films, television serials, and graphic novels. A new movie based on his 1986 novel It is currently the highest-grossing horror film of all time.

But as Douglas W. Texter notes, King hasn’t been given a great deal of respectful attention by academic critics. King’s tremendous commercial success, his specialization in the gothic-horror genre, and the “atrociousness of the films” made from his novels and short stories have all combined to leave King beneath the notice of much of academia.

Written under a pen name, The Running Man’s dystopian vision was largely swamped once it was revealed that King was the actual author.

Texter, however, finds much to take notice of. He uses as a case study King’s early novel The Running Man, written under the pseudonym Richard Bachman. Published in 1982, The Running Man was actually written in 1971, in the midst of the Vietnam War. It’s a dystopian satire, forecasting a grim U.S.A. in 2025, and borrowing “language and settings from Orwell and Huxley” and Joseph Heller’s Catch-22. The conflation of law enforcement and entertainment later seen in such television shows as America’s Most Wanted and COPS helps make The Running Man a “kind of ‘missing link’ between Orwell, Huxley, and Zamyatin (that attack state-produced oppression) and the postmodern work of Octavia Butler and Kim Stanley Robinson (that direct their fire towards transnational capital).”

Texter argues that the work’s dystopian vision was largely swamped once it was revealed that King was the actual author of the Bachman books in 1985. Then it became another King-branded horror show, ripe for adaptation.

The Running Man was the inspiration for a 1987 feature film starring Arnold Schwarzenegger and Jesse Ventura, who coincidentally (or not) both later became state governors. Texter calls the novel “an early and edgy King work that was transformed into the very thing it both predicted and criticized—a reality television program featuring a nation-wide manhunt and huge cash prizes.” But Hollywood, in its inimitable style, transformed the novel. Texter notes that “about the only things that remain [of the transition from book to movie] are the title, the name of the leading character, and a game-show theme.”

Fourteen years later, Matt Damon and Ben Affleck proposed The Runner as a new television reality show. It was to feature a specially trained “Runner” attempting to evade a nationwide manhunt. Those who survived a month would win one million dollars. Folks at home could participate by registering as bounty-hunters. Stephen King’s Vietnam era satire had become another product of precisely what it forecast. Or it would have, had not the Damon/Affleck project been cancelled in the wake of the events of September 11, 2001. (King’s novel, by the way, ends with its protagonist flying an airplane into a skyscraper.)

Quoting Adorno, Texter notes that “works that are usually critical in the era in which they appear” are neutralized by the culture industry over time. Thus “entombed in the pantheon of cultural commodities” they are robbed of their sting. The career of The Running Man is a “perfect example of such neutralization,” one completed in a generation. Stephen King’s satire had become someone else’s show.

The post Stephen King’s Prophetic Early Work appeared first on JSTOR Daily.

oloriel: A few lines of Tengwar calligraphy. (blatant tolkienism)
[personal profile] oloriel posting in [community profile] silwritersguild
Silmarillion40 Banner

Bewildered by Dawn Felagund. Not all of the Eldar believed Valinor would be superior to Middle-earth. Having followed the man she loves from the land she loved also, Miriel struggles to cope with a strange life in a strange new land. (Short Story)




Just joining us? The Silmarillion40 is a collection of fan-created fiction, art, and poetry in honor of the 40th anniversary of the publication of The Silmarillion. For forty days, we will feature at least one fanwork, following the chronology of The Silmarillion. Find the full Silmarillion40 collection here.

(no subject)

Sep. 20th, 2017 03:06 pm
independence1776: Tallit (Jewish prayer shawl) (Jewish)
[personal profile] independence1776
L'shanah tovah u’metukah! For a good and sweet year!

Shanna Tova!

Sep. 20th, 2017 02:45 pm
[personal profile] marta_bee
I've been Dealing with the student loan people. Apparently a form I faxed in was never received. It was meant to renew my repayment plan (tied to my income so it needs periodic renewal), so the bills I received were way out of budget for me, and I kept calling and leaving messages after hours but never actually getting in contact with them, until they were threatening default.

Finally carved out some time and spoke to them, so that's now under control again. I think -- I'm not betting on anything until I see the next bill. But it feels under control, which has given me a bit of a mental restart on offline life generally. And the timing is really nice, too: it's Rosh HaShannah tonight, so a kind of new year starting again (or re-turning; the literal meaning of repentance). It's got me in just the right frame of mind.

A good and sweet year to those of you marking it, and a (hopefully) pleasantly chilly autumn evening to the rest of you.

Also: Here are some dolphins enjoying klezmer (Jewish jazz). It seems fitting.

Be Afraid. Be Very Very Afraid. II

Sep. 20th, 2017 09:54 am
la_samtyr: asian art drawing of sleeping cat (Default)
[personal profile] la_samtyr
Guys, CALL!! They're 2 votes from killing the ACA. Congressional staffers say they were inundated with calls before. Now they say they are getting THREE OR FOUR. Call. Call. Call. Every. Day. (202) 224-3121.

health care part 3
--The above was snagged from my fb page.
[syndicated profile] jstordaily_feed

Posted by Grant Shreve

In a recent interview with Slate, political scientist Mark Lilla remarked that Democrats have struck “a slightly hysterical tone about race.” Lilla’s breezy dismissal of America’s original sin is nothing new. What is new, however, is this use of the charged word “hysterical.” Whether Lilla knows it or not, hysteria and race have a long and unseemly shared history in American life.

Hysteria was a woman’s disease, a catchall malady for women who exhibited any of a multitude of symptoms, including paralysis, convulsions, and suffocation. Although diagnoses of hysteria date back to ancient Greece (hence its name, which derives from hystera, the Greek word for “womb”), it was in the nineteenth century that it emerged as a linchpin of modern psychiatry, gynecology, and obstetrics. According to Mark S. Micale, nineteenth-century physicians “considered hysteria the most common of the functional nervous disorders among females.” It was, wrote the prominent nineteenth-century neurologist Jean-Martin Charcot, the “great neurosis.”

Hysteria emerged in the late nineteenth century as a tool of patriarchal power and white supremacy.

But as feminist historian Laura Briggs demonstrates in “The Race of Hysteria: ‘Overcivilization’ and the ‘Savage’ Woman in Late Nineteenth-Century Obstetrics and Gynecology,” hysteria was also a racialized condition. More than just a woman’s disease, it was a white woman’s disease. American medical professionals in the 1800s who treated hysteria diagnosed the disorder almost exclusively among white, upper-class women—especially those who had sought higher education or had chosen to abstain from having children. From this data, they hypothesized that hysteria must be a “symptom of ‘overcivilization,’” a condition disproportionately affecting women whose torpid lives of luxury had made their nervous and reproductive systems go haywire, which, in turn, threatened whiteness itself. “The whiteness of hysteria,” writes Briggs, “signaled the specifically reproductive and sexual failing of white women; it was a language of ‘race suicide.’” Nonwhite women, on the other hand, because they were thought to be more fertile and more physically robust, were thus marked as “irreconcilably different” from their white counterparts, more animalistic and thus “fit for medical experimentation.”

It was in this way that hysteria emerged in the late nineteenth century as a tool of patriarchal power and white supremacy, a means of dampening the educational ambitions of white women and dehumanizing people of color, all under the elaborate drapery of scientific rigor and professional authority.

Although hysteria virtually disappeared from medical literature by 1930, it has had a long linguistic afterlife. It’s mostly used as a synonym for funny (e.g., “Last night’s episode of Veep was hysterical”), but it also retains some of its original nosological flavor when used in the sense of “uncontrollably emotional,” as Lilla did in his Slate interview.

Lilla likely didn’t intend to strike the pose of a nineteenth-century obstetrician when he said that “there’s been a kind of slightly hysterical tone about race” on the political left. Nevertheless, if words still mean things—and in this post-covfefe world, one hopes they do—then, wittingly or not, Lilla still resuscitated a pathological term of art with a long history of undercutting women’s aspirations toward autonomy and nonwhite people’s struggle for recognition and equal treatment under the law. Lilla’s choice of words was, at best, unfortunate. Attributing liberals’ social concern for the violence enacted upon marginalized groups to emotional imbalance minimizes a genuine sadness and an authentic anger. Even three decades after “hysteria” was deleted from the third edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-III), some of the word’s diagnostic power obviously still remains.

The post The Racialized History of “Hysteria” appeared first on JSTOR Daily.

Inventing the “Illegal Alien”

Sep. 20th, 2017 11:05 am
[syndicated profile] jstordaily_feed

Posted by Livia Gershon

The fight over DACA has put a spotlight on the strange position of undocumented people in the United States—people who work, study, and build lives in the country just like citizens but remain fundamentally different because of their legal status.

Mae M. Ngai explains how the concept of “illegal alien” came to the United States nearly 100 years ago.

Before the late nineteenth century, Ngai writes, U.S. officials generally approved of almost all immigration, which provided settlers and workers for a growing nation. In the 1870s and 1880s, Congress began passing laws excluding some immigrants, particularly those from China but also paupers, polygamists, and people with “dangerous and loathsome contagious disease.” Still, Ngai writes, “Little could be done if they evaded detection and entered the country.”

In the early years of the twentieth century, a million people a year immigrated to the U.S., but only two or three thousand were deported each year—usually after ending up at an asylum, hospital, or jail.

By the 1930s, the new Border Patrol was patrolling the southwestern countryside, using extra-legal violence and apprehending hundreds of immigrants at a time.

Things changed dramatically in the 1920s. Nationalism spurred by World War I, combined with stereotypes of poor Southern and Eastern European immigrants in the nation’s urban slums, created a new hostility to immigration. The Immigration Act of 1924 restricted legal immigration from Europe to 150,000 people a year and allowed for the deportation of anyone who entered after that year without a valid visa. Congress also created the nation’s first Border Patrol to limit entry across the country’s land borders, and it turned unauthorized entry into the country into a punishable crime.

The new emphasis on clear-cut national boundaries particularly transformed the area around the Mexican border. Although Mexicans were not subject to immigration quotas, to enter legally they had to pay taxes and fees, so many crossed clandestinely.

Prior to the 1920s, Mexicans had moved freely in and out of the Southwestern U.S., finding work building railroads and working the region’s mines and farms. But by the 1930s, the new Border Patrol was patrolling the southwestern countryside, using extra-legal violence and conducting sweeps that apprehended hundreds of immigrants at a time.

The expulsion of immigrants ramped up quickly, from 2,762 in 1920 to 38,795 in 1930.

With new laws came new language to describe “illegal aliens.” In 1925, in language familiar to anyone who follows the news today, the Immigration Service warned that the presence of people “whose first act upon reaching our shores was to break our laws by entering in a clandestine manner” was a “potential source of trouble, not to say menace.”

This idea—that the most important question about immigrants was their legal status, rather than moral character, willingness to work, or presence or absence of loathsome disease—was new, a product of an emerging twentieth-century world in which national borders were more important than ever before.

As Ngai writes, “This view that the undocumented immigrant was the least desirable alien of all denotes a new imagining of the nation, which situated the principle of national sovereignty in the foreground.”

The post Inventing the “Illegal Alien” appeared first on JSTOR Daily.

Meme follow up!

Sep. 19th, 2017 08:24 pm
moetushie: (a dame's a dame)
[personal profile] moetushie
Thank you for the questions, this was fun!

The list:

1. Yamazaki Sousuke from Free!
2. Maedhros from The Silmarillion
3. Matsuoka Rin from Free!
4. Aomine Daiki from Kuroko no Basuke
5. Rebecca Bunch from Crazy Ex-Girlfriend
6. Bodhi Rook from Rogue One
7. Steve Rogers from the Captain America movies.
8. Martha Jones from Doctor Who
9. Jason Todd from the pre-reboot DCU.
10. Bucky Barnes from the Captain America movies.
11. Kagami Taiga from Kuroko no Basuke
12. Victor Nikiforov from Yuri!! On Ice
13. Vanessa Ives from Penny Dreadful
14. Mary Crawley from Downton Abbey
15. Darius from Atlanta

I have so few active fandoms now that I really had to stretch back in my memory palace to find enough characters for this meme. Sad.

Questions )
angelica_ramses: (Default)
[personal profile] angelica_ramses posting in [community profile] silwritersguild
Silmarillion40 Banner

Paths to Good Intentions by just_jenni. Melian is bored and travels to Arda to see the Quendi. She meets Elwe. His people wonder where he is.




Just joining us? The Silmarillion40 is a collection of fan-created fiction, art, and poetry in honor of the 40th anniversary of the publication of The Silmarillion. For forty days, we will feature at least one fanwork, following the chronology of The Silmarillion. Find the full Silmarillion40 collection here.
[syndicated profile] jstordaily_feed

Posted by Kristen French

You might have heard that the Lyme apocalypse is upon us this year. In spring, media outlets from NPR to USA Today to the New Scientist were forecasting a black-legged tick population eruption with a consequent outbreak of Lyme disease in the American Northeast. Transmitted by tick bite, Lyme can cause symptoms such as fatigue, fever, headache, and a characteristic bull’s-eye skin rash called erythema migrans. If untreated, the disease can spread to the joints and the heart, and can even lead to neurological complications such as Bell’s palsy.

Today, Lyme is North America’s leading “vector-borne” disease—a term used to describe any disease transmitted from animal to human via a live host. Despite decades of research and control efforts, new cases of Lyme in humans continue to climb. Confirmed cases reached a total of 28,500 in the U.S. in 2015 (plus an additional 9,600 probable cases). That’s more than double the number found when cases were first recorded in 1995, according to the Centers for Disease Control and Prevention, though the trajectory has not been straight up, and increases may be partially related to heightened awareness. The number of counties in the U.S. that are considered Lyme disease hot spots has also more than tripled in that time, though the overwhelming majority of these are concentrated in just 14 states.

The problem is not confined to North America. Europe is witnessing a rise in confirmed cases of Lyme, and the disease is extending its geographic reach in both Europe and the temperate, forested parts of Asia. Based on genetic sequencing of Borrelia burgdorferi, the Lyme-causing bacterium, some scientists now believe that the disease originated in Europe rather than in the northeastern U.S. The only known organism that doesn’t use iron to make proteins and enzymes, B. burgdorferi is particularly difficult for human bodies to kill because our immune system often tackles pathogens by starving them of iron. B. burgdorferi also lacks many other features common to bacterial pathogens, such as toxins and specialized secretion systems, which human immune systems use to detect and fight foreign invaders.

The black-legged tick is the only organism that can transmit B. burgdorferi between animals or between animals and humans. Ticks must have a blood meal at each of their three life stages to survive, so they climb onto their hosts from leaf litter or the tips of grasses or shrubs, attach their mouthparts to the host, and suck its blood slowly for several days. If the host animal has Lyme bacteria in the blood, the tick can ingest the pathogen and become infected, transmitting it to a new host at its next feeding, when the pathogen will rise from its gut to its feeding tubes. Once infected, the ticks stay infected for life.

Much of the media coverage of this year’s predicted tick-a-geddon cites the work of Rick Ostfeld, an ecologist at the Cary Institute of Ecosystem Studies in Millbrook, New York, and his wife and partner, Felicia Keesing, an ecologist at Bard College. The pair are veterans of Lyme and tick research, and have been predicting a Lyme plague in 2017 for two years. Ostfeld’s research suggests that the incidence of Lyme can be influenced by, among other things, a two-year chain of events that begins with a so-called mast year, when all of the oak trees in a particular region yield a bumper crop of acorns in synchrony.


The term mast comes from the Old English word mæst, the nuts of forest trees that litter the ground, especially those used for fattening swine. Scientists have speculated that producing a large crop of seeds at once provides an evolutionary advantage to certain kinds of trees: there are enough seeds left over to take root even after seed predators have been satiated. The trees seem to spend almost all of their resources on reproduction in a mast year, leaving little left over for growth, but they flip this equation in the following years. The lean years may then help to control the predator populations that consume the seeds. Scientists believe that trees synchronize seed production in response to environmental cues such as rainfall or temperature.

It turns out that 2015 was a mast year, while the following year saw an explosion in white-footed mice in the region, according to Ostfeld and Keesing. It is worth noting that in his research, Ostfeld only found statistically significant correlations between acorns or mice and Lyme disease incidence in New York and Connecticut, and not in the other five states he studied. And yet, researchers in Poland have also found correlations between acorns and Lyme.

In a mast year, the trees spend almost all of their resources on reproduction, yielding a bumper crop of acorns.

Here’s how it is thought to work: A bumper crop of acorns in the northeastern United States attracts white-tailed deer into oak stands in autumn, and white-footed mice and eastern chipmunks the following summer—the animal species that are among the most common hosts for black-legged ticks. Adult ticks mate on the deer in the fall and lay eggs on the ground in the spring, which hatch into larval ticks in the summer. These larval ticks feed on the mice and chipmunks attracted to the acorns, as well as on small birds. The following year, infected nymphs may land on human hosts, transmitting the disease in the process. The year after that, adult ticks can feed on deer and humans, though humans are much more likely to be infected by nymphs, which are harder to detect than adults. Nymphs feed in spring and summer, while adults feed in the fall.

Does Biodiversity Curb Disease?

To ward off tick bites, health officials advise keeping yards trim. Mowing the lawn, cutting back overgrown brush, and cleaning up leaf litter are all thought to reduce potential exposure to tick populations in residential areas. The irony, though, is that Ostfeld, Keesing, and other scientists believe the opposite approach may be needed for our forests if we want to minimize our risk of contracting Lyme. We should leave large forest stands intact, limiting fragmentation of habitat.

In a 2001 paper, Ostfeld argues that in residential areas forest grows in chunks that are too small and fragmented to support a wide range of species—especially species that prey on ticks’ most popular hosts. Those predators include wolves, opossums, and skunks, which happen to be poor hosts for Lyme, he says. So when we cut down forests, we end up with fewer predators and thus with more deer, chipmunks, and mice, increasing our exposure to ticks.

“If we avoid chopping it up, destroying, or fragmenting the forested habitat, then we will automatically maintain that diversity of animals, most of which serve to regulate Lyme,” says Ostfeld. He and others have come to call this phenomenon the "dilution effect." Supporters of the dilution effect believe that biodiversity helps to lower disease risk for humans as a general rule. A meta-analysis published in July 2015 in the Proceedings of the National Academy of Sciences found widespread evidence that biodiversity in both plant and animal populations inhibits parasite spread by regulating populations of susceptible hosts or interfering with parasite transmission.

According to Ostfeld, fragmented habitats are typically most welcoming for “live fast, die young” species that are good hosts for all kinds of disease. They can carry an infection like Lyme without mounting an immune response because it is more advantageous to expend their resources on predator avoidance than fighting an infection that likely won’t kill them before they reproduce. (In the case of Lyme, this hypothesis is complicated by research that suggests some mice do, in fact, mount an immune response while others develop symptoms of Lyme disease after infection.) Another take on biodiversity and Lyme risk was published in 2014 by Canadian scientists, who found that a lack of diversity even among tick host species—and not just fewer predators—can increase Lyme disease risk.

Interestingly, not everyone agrees that biodiversity protects against Lyme or other plagues. A 2012 paper critiqued such thinking as “Panglossian,” or blindly optimistic without regard to the evidence. Others have said biodiversity plays precisely the opposite role, encouraging disease. “The emergence of Lyme disease had to do with reforestation and increased biodiversity,” counters Durland Fish, professor emeritus of epidemiology at Yale University, who studies vector-borne pathogens and disease ecology. Just look at the Amazon, he says, one of the most biodiverse regions on the planet, yet a hotbed of pathogens. “At the beginning of the 20th century, there were no deer in the northeast….No deer, no ticks, no Lyme disease.” The deer arrived with reforestation in suburban areas, he says.

Did the Lyme plague spare us this year? It’s still too early to say. Thomas Daniels, director of Fordham University’s Louis Calder Center in Armonk, New York, who has been counting ticks in Westchester for decades, says it has so far been a pretty average year. “High tick numbers lead to more bites, more bites result in more [Lyme disease] cases, but as you can imagine, many things can influence that in a given year: weather, knowledge of ticks and Lyme risk, likelihood of taking personal protection measures, amount of time spent in tick habitat,” he wrote in an email. “We’re actually analyzing our data set to better understand the system we have, but, in general, we haven’t seen a steady increase in nymphal numbers over the years. Variation from one year to the next is the norm.”

The post Can the Acorn Crop Predict Lyme Disease? appeared first on JSTOR Daily.

[syndicated profile] jstordaily_feed

Posted by Livia Gershon

The Equifax hack revealed how vulnerable most of us are to the agencies that compile our credit records. Having “good credit” has been important to the way people judge each other since the dawn of commerce. But, as scholar Josh Lauer explains, the advent of credit reporting agencies in the early twentieth century reflected—and facilitated—a sea change in who makes those judgements and how they do it.

Up until the middle of the nineteenth century, Lauer writes, credit was personal. People got loans and ran up tabs based on interpersonal relationships and reputation within a community. As one wag noted in 1833, a debtor “is a man of note—of promissory note; he fills the speculation of many minds; men conjecture about him, wonder and conjecture whether he will pay.”

But growing companies and markets made credit a less personal matter, creating demand for a more objective source of information on a potential borrower’s creditworthiness.

Even as early as the mid-twentieth century, credit agencies argued that keeping a good credit record was a moral concern.

In the 1840s, the first credit reporting firms created a beta version of the modern credit history report—though only for companies, not individuals. As retailers like butchers and department stores got a look at this credit monitoring model, they saw its potential value for the tabs they let customers keep.

And so, in the 1870s, consumer credit reporting operations began popping up. At first they were local and disorganized, and many shut down as quickly as they opened. The agencies began to come into their own in the early twentieth century. In this era of business rationalization, professional credit managers—often with experience in department stores and installment houses—developed best practices for gathering information and networks for sharing it.

By the early twentieth century, the need for impersonal credit information was obvious, thanks to a massive shift in daily life. In a 1913 talk, President Woodrow Wilson told the public that economic conditions had changed “absolutely, from top to bottom.” Now, he said, “the everyday relationships of men are largely with great impersonal concerns, with organizations, not with other individual men.”

Between the First and Second World Wars, credit reporting companies tried to boost public awareness of credit reporting, organizing big publicity campaigns. They called on individuals to “Treat their credit as a sacred trust” and argued that keeping a good credit record was a moral concern.

Over time, the credit record became increasingly abstracted, particularly in the 1970s and ‘80s, when records were computerized and consolidated into a few national credit reporting agencies.

But Lauer notes that our credit records remain as much a measure of our “character” as they ever were. It’s just that now we’re being judged not by people we know for the way we’ve treated them, but by faceless corporate entities for the way we’ve treated other faceless corporate entities.

The post How Credit Reporting Agencies Got Their Power appeared first on JSTOR Daily.

[syndicated profile] jstordaily_feed

Posted by The Editors

Well-researched stories from around the web that bridge the gap between news and scholarship. Brought to you each Tuesday from the editors of JSTOR Daily.

Lovecraft in the time of MAGA (Public Books)
by Gordon Douglas
A new series of comic books explores the racism-driven horror of H.P. Lovecraft. What can it tell us at a time when blatant loathing of the other is an ascendant political force?

Finding shelter from the storms to come (The Conversation)
by Dean Yang and Parag Mahajan
More hurricanes mean more people in need of a new place to live. But refugees only flock to the U.S. under particular circumstances, which raises questions about our immigration laws.

Can a computer really tell if you’re gay? (Slate)
by Sonia Katyal
A recent study claims that AI can distinguish quite successfully between pictures of gay and straight people. Here’s why we should look carefully at this kind of research.

Beyond brain-vs-gut (New York Magazine)
by Katie Heaney
When you’ve got a big decision to make, do you trust your gut or your brain? Social scientists say that question misses a lot about how human cognition works.

Lessons from Saturn (Vox)
by Brian Resnick
After 13 years, Cassini has finished its epic exploration of Saturn in dramatic fashion. Here’s what the spacecraft has taught us about the planet, its rings and moons, and the universe.

Got a hot tip about a well-researched story that belongs on this list? Email us here.

The post Suggested Readings: Lovecraft’s Legacy, Hurricane Refugees, and AI Gaydar appeared first on JSTOR Daily.

Facing Ourselves Online

Sep. 19th, 2017 11:58 am
[syndicated profile] jstordaily_feed

Posted by Alexandra Samuel

Once upon a time, the “me” that I pictured in my head was the face that I saw in the mirror a few times a day.  But that me has long since been eclipsed by the versions of myself that I see far more often: the “official” version on my LinkedIn profile and personal website; the carefully selected “casual” face of my latest Facebook profile pic; the candid photos that have been posted by family, friends, or colleagues. As a person who spends a disturbing portion of her daily hours grasping a computer or mobile phone, I’m far more likely to see myself online than in a mirror.

In her biweekly column “The Digital Voyage,” Alexandra Samuel investigates the key psychological, social, and practical challenges of migrating to an online world.

All those online photos have become their own cultural imperative, with the effect of reinventing women’s faces—or at least, the way we curate them. When I was growing up, it was acceptable for people to demur or step out of the frame when a camera was produced; the phrase “I hate having my picture taken” was commonplace. Today, the group selfie is such a staple of social interaction that we seem to have accepted the inevitability of photography.

Our only recourse, in the face of constant snapshots, is to ensure we are always camera ready. “The seventy-five women in the entering freshman class of the School of Business of the College of the City of New York are enjoined from wearing lipstick, jewelry, or any other of the fineries distinctive to them,” reported The Journal of Education in 1931.  To issue such an injunction today would be unimaginable, particularly for a generation that has forged its aesthetic of self-presentation in the fire of Snapchat.

If the visual culture of the internet has imposed new beauty standards, it has also offered a helping hand in meeting them.

This relentlessly photographed existence means it’s never been more important to be “on fleek”—a phrase that first entered the communal lexicon when used to describe eyebrows in a video selfie. And indeed, eyebrows are emblematic of the way our self-presentation (and indeed, our self-concept) has evolved through the combined visual pressures of Instagram, Tinder, and the ubiquitous profile photo.

In recent years, I’ve become obsessed with what I think of as “New York eyebrows”: you know, those incredibly groomed brows that shoot out into darkly coloured, sharply defined tips. You might spot them on a 20-something woman anywhere in America, but they are particularly widespread on the isle of Manhattan. The first person I ever saw with this look was a colleague visiting from New York, and every time she visited our Vancouver offices, her eyebrows were the subject of water-cooler discussion for weeks afterward.

But it turns out that these in-your-face arches aren’t called “New York eyebrows” by anyone but me: they’re known as “Instagram brows.” That’s because the driving imperative behind eyebrows that could stop traffic is to win traffic and attention for your online selfies. In today’s world, you are your online presence, which means you are your headshot. And since everyone knows that your eyebrows are the frame for your entire face (hello? Makeup 101?), that means your eyebrows are the foundation on which your entire existence is based. Egads! I have to go grab my tweezers right now.

The urgent need to get my brows camera-ready would come as no surprise to the author of the 1957 paper, “Themes in Cosmetics and Grooming,” the aptly named Murray Wax.  Wax observes that “changes in dress and grooming are universally employed to denote the movement from one social status to another (infancy, childhood, sexual maturity, marriage, maternity, ability, death) or the assumption of special office (chief, priest, medicine man, Doctor’s degree).” Since even I can’t pretend that “Instagram user” constitutes a “special office,” I’m forced to acknowledge that my metamorphosis into selfie-ready internet icon is a movement from one social status to another. Or at least, so I hope—if I can just get these eyebrows right.

If the visual culture of the internet has imposed new beauty standards, it has also offered a helping hand in meeting them. I’m talking about beauty blogs, of course, which offer me a lifeline whenever Facebook ads try to sell me on the merits of some new cosmetic upgrade, like the latest one: magnetic eyelashes. “Attaching to what?” a friend asks when I post about them on Facebook. “Metallic contact lenses?”

A lash-by-lash account on PopSugar informs us that the lashes attach to each other—right, that makes slightly more sense now. Still, is it really a good idea to stick magnets in your eye?  If online beautification demands such extremes, perhaps it is because we are being exhorted to achieve mythical levels of self-transformation; just as “Oedipus effaces his former identity and transmutes himself into the blind seer” by blinding himself (or so says David McDonald in Theatre Journal), I can stick something in my eye, discard my offline husk, and transmute myself into an internet beauty.

Our devices are gradually migrating us to a world in which the norm is to see ourselves the way others see us.

What ensures I don’t end up in the same position as poor old Oedipus is the endless array of how-to beauty videos. As another Facebook friend points out, the videos are where you learn the dire consequences of trying to apply your magnetic lashes with metal tweezers. (Or where you discover that they’re a recipe for hilarious frustration.) Indeed, there is little you can’t learn about the ins and outs of self-beautification by hunkering down for an evening of cosmetics how-tos on YouTube.

In a 1992 article on makeup videotapes that anticipated the YouTube phenomenon by a couple of decades, Judith L. Goldstein observes that these make-up lessons are teaching more than the right way to hold a mascara wand:

the aim of the make-over tapes is a double transformation: first a transformation of faces from less ideal to more ideal (both the faces in the video and the face at home), and second, a transformation of the gaze of the viewer. The gaze must be transformed from innocent (or uninformed/unformed) vision to critical judgment…..It is this discerning eye which the student of makeup learns to acquire. Although she is explicitly learning a procedure—the application of makeup—she is also more implicitly learning to convert the innocent gaze to one of critical judgment. The makeup artists are selling their skills and their makeup products, but they are also selling the discerning eye.

By turning to the screen—rather than the mirror—that “discerning eye” transforms the way we see ourselves. This is a literal description as much as a metaphorical one, because mirror-users become accustomed to the version of themselves that they see in the mirror, in which left and right are flipped. So accustomed, in fact, that (as Roy Sorenson points out in “The Aesthetics of Mirror Reversal”) “[p]sychologists have found that people prefer mirror reversed photographs of themselves because that is how people are used to seeing themselves in mirrors.”

For the most part, however, that’s not what our cameras give us. Sure, your smartphone may show you a mirrored image when you’re lining up your selfie, but take a look at what’s captured to your camera roll, and you’ll see yourself the way everybody else sees you. You can use a photo editor or tweak your phone settings so that your selfie is stored as that comforting mirror image, but by default, our devices are gradually migrating us to a world in which the norm is to see ourselves the way others see us.

From those Instagram photos your friends tag you in, to the selfies you upload yourself, you are steadily accreting a self-portrait in which you gaze back at yourself through the eyes of the other, just as Judith Goldstein predicted. Sure, the eyes of the other may have fabulous lashes and on-fleek brows, but they are still alien; still separating us from ourselves through the lens of a smartphone camera.

No wonder that camera ensures we are constantly curating our faces. Online makeup tips may come to our rescue in that process, but at the cost of invoking a critical gaze that leads us to see ourselves rather than to be ourselves—a shift that is only compounded by the way smartphone photography moves our mental self-image away from what we see in the mirror.

Yet this is one transformation that is more than skin deep. As we shift from a mental self-concept that’s grounded in what we see in the mirror (an experience of self-perception that takes place largely alone) to one that is grounded in how others see us through the camera lens, our self-image ceases to be private and individual. The photographic pressure to curate our faces is inextricable from the online pressure to curate our lives; to present and perform, instead of simply living.

Say cheese!

The post Facing Ourselves Online appeared first on JSTOR Daily.

When Packrats’ Hoards Are Helpful

Sep. 18th, 2017 11:00 pm
[syndicated profile] jstordaily_feed

Posted by Matthew Wills

Packrats of the genus Neotoma have lent their name to humans who won’t throw anything away. These rodents, also known as wood rats, were aptly chosen as a comparison. They make stick nests that they jumble up with food, waste, and other debris, as well as the things they collect. When they live near humans, their collections can include silverware, shoes, bits of clothing, newspapers, even pieces of the traps set out for them. They’ve also been called trade rats because they “trade” an object on the way home for something they like better. In PrairyErth, William Least Heat-Moon wrote “campers have awakened in the morning to find a pocketknife or compass traded for a pinecone or deer turd.”

These treasures can remain in their nests for a rather long time. Biologists have been reconstructing the last 50,000 years of plant, animal, and climate interactions in the American Southwest by studying the remains (middens) of these packrat nests. These middens can provide a record of climatic changes, vegetation profiles, and animal life during this period.

The remains of ancient packrats’ nests can be read as localized climate change indicators.

Here’s the thing: packrats sometimes pee in their nests. The urine crystallizes as it dries out, helping to solidify and preserve the materials of the nest in the arid terrain of the rats’ habitat. Fossilized packrat middens thus become inadvertent hoards of pollen and other plant remains, as well as fragments of insects, reptiles, and even other mammals that may have wandered in. (DNA can now be used to identify such fragments down to the genus if not species level.) Since packrats tend to collect material from a limited range around their nests, their middens can be read as localized climate change indicators.

Steven D. Emslie, Mark Stiger, and Ellen Wambach write that the urine-solidified middens are sampled with rock hammers. These are then rinsed and soaked in water to remove the urine; the cleaned and then dried fragments of seeds, bark, twigs, leaves, and solid feces (used for radiocarbon dating) are sorted and identified.

In their study, Emslie et al. took microbotanical remains from seventeen bushy-tailed packrat nests in the Upper Gunnison Basin in Colorado to reconstruct environmental changes during the last three thousand years. They chart a rising and falling of the timberline and the appearance and disappearance of various pine species (some, like ponderosa and lodgepole, no longer found locally). Bristlecone pine moved into cooler, drier lower elevations during cold periods, for instance between 1500 and 1850 (sometimes called the “Little Ice Age”).

By studying the response of plants and animals to differing temperature and rainfall regimes, paleobiologists can offer models for the rapid disruptions we’re seeing now because of the earth’s rising temperature. Pollen and dendrochronology have become standard markers for such re-creations of historical environments. Packrat middens offer another source for tracing the varied careers of plants and animals through time.

The post When Packrats’ Hoards Are Helpful appeared first on JSTOR Daily.
