Writing the Great American… Textbook or Nonfiction Book

I am extremely excited about the possibility of one day writing a book, so this post struck a chord, especially since it’s specific to academics.

I’m saddened by the “wait for tenure” advice because tenure seems so far away right now. At the same time, I certainly wasn’t expecting to have the patience and experience to take on anything so involved yet, or even in the next 10 years.

I suppose it’s good to know I’m on the right track: I’m writing articles and starting to write book reviews now, and hopefully will be ready to do a longer feature in a year or so.

It’s daunting how many things are going to happen in the next ten-ish years, but reassuring that they are happening at a pace I can handle: The real job, the real place to live, the seeds of a book, perhaps the seeds of a family(!), and (hopefully) tenure.

So when am I going to take my backpacking tour and get my mohawk? I don’t want to ever call myself a grownup, even though I have mostly acted like one since I was 7.


Rupert Murdoch is a Socialist?

Well, compared to academic publishers, according to this story from the Guardian. (Fuller version here.)

“Everyone claims to agree that people should be encouraged to understand science and other academic research. Without current knowledge, we cannot make coherent democratic decisions. But the publishers have slapped a padlock and a Keep Out sign on the gates.
You might resent Murdoch’s paywall policy, in which he charges £1 for 24 hours of access to the Times and Sunday Times. But at least in that period you can read and download as many articles as you like. Reading a single article published by one of Elsevier’s journals will cost you $31.50.”
Monbiot goes on:
“Of course, you could go into the library (if it still exists). But they too have been hit by cosmic fees. The average cost of an annual subscription to a chemistry journal is $3,792(5). Some journals cost $10,000 a year or more to stock. The most expensive I’ve seen, Elsevier’s Biochimica et Biophysica Acta, is $20,930(6). Though academic libraries have been frantically cutting subscriptions to make ends meet, journals now consume 65% of their budgets(7), which means they have had to reduce the number of books they buy. Journal fees account for a significant component of universities’ costs, which are being passed to their students.
Murdoch pays his journalists and editors, and his companies generate much of the content they use. But the academic publishers get their articles, their peer reviewing (vetting by other researchers) and even much of their editing for free. The material they publish was commissioned and funded not by them but by us, through government research grants and academic stipends. But to see it, we must pay again, and through the nose.
The returns are astronomical: in the past financial year, for example, Elsevier’s operating-profit margin was 36% (£724m on revenues of £2 billion)(8). They result from a stranglehold on the market. Elsevier, Springer and Wiley, who have bought up many of their competitors, now publish 42% of journal articles(9).”
What I wonder is how much of this cost is really being passed on to students. Is it actually a factor in driving up tuition? My guess is no: from what I understand, college tuition prices mostly reflect demand for a luxury good – raising the price tag makes a college more desirable to attend. Extra funds go to endless building projects, fancy gyms, and the like. (Unfortunately, this means some students will be pampered for four years and then faced with crippling loans.) In short, I don’t think the academic publishers are to blame for rising tuition.
The greater crime, to me, is putting publicly funded research behind a paywall. I wouldn’t say that the academic publishers are “evil,” more that they are taking advantage of the economics. They certainly share part of the blame for putting the research out of reach, but no one is stopping them from doing so, either.
Some open-access journals and websites are expanding. Scientists could choose to publish their work only in these, perhaps inspiring others to leave the traditional publishers. That would gradually increase the prestige of the open journals, create demand for new ones, and weaken the publishers’ grip. But currently in the natural sciences there are several thousand subscription journals compared to a paltry handful of open venues, so most scientists simply have to publish in the journals for sheer mathematical reasons. It seems a top-down solution is needed, since the economic drivers are not there yet. How about publicly funded journals to complement the publicly funded research?

AWIS 40th Anniversary Meeting, Session Summary


The Frontiers in Sustainability Panel at the AWIS 40th anniversary conference consisted of three speakers, all from different areas of environmental science.

The first speaker, Ms. Kristen Graf, is the Executive Director of Women of Wind Energy (WoWE). Ms. Graf started out as an engineering major intent on developing wind technology, but realized the technology itself is already mature; the real problem is adoption. Fossil fuels and nuclear energy still dominate electricity generation, and the relative proportions of nonrenewables and renewables have remained fairly stagnant despite all of the media buzz and politicking, despite wind power growing in popularity worldwide, and, puzzlingly, despite increased wind turbine production in the U.S. Why? Because those U.S.-produced turbines are being installed in rapidly developing nations such as China and India. She addressed the need to keep some of the turbines here, both for the economic impact and the environmental impact. (She echoed Nancy Jackson’s remarks earlier in the day: if science jobs in the US are in peril, then the outlook for improving opportunities for women in science in the US is bleak.) Ms. Graf’s interest in effecting this change in wind power usage led her to leave engineering and ultimately to WoWE. She stressed that there is only a small window of time left to act on issues related to global warming, and she presented some real examples of success with renewables (Cornell and Denmark).

Dr. Helen White, the next speaker, is an Assistant Professor of Chemistry at Haverford College. She presented the audience with an environmental mystery that only a chemist could solve. Dr. White’s most recent research involved collecting samples from the Deepwater Horizon oil spill. While much of the media coverage of the event focused on surface tragedies, such as the plight of birds, the impact of the spill on the deep sea was rarely mentioned. In the deep sea near the spill, brown oil flocculates were observed. In the same area, biologists noticed deep sea coral with rare tissue damage: the polyps were literally falling off. The biologists attributed this damage directly to the spill, but Dr. White was skeptical of the causation; there is a lot of oil found naturally in the Gulf of Mexico. To address this from a chemist’s perspective, she traced the source of the oil. After all, oil is not just one compound but a mixture, so oil from different sources will have different chemical signatures. After testing several samples using a novel technique, 2D gas chromatography, she determined the spill was not to blame for the coral damage. This result was unpopular with the biologists, but the evidence was striking. In doing this work, Dr. White found a new mystery: quite improbable chemical signatures in some of the oil samples. Her next quest is to explain them; she speculates that there may be undiscovered biodegradation pathways in the deep sea.

The last speaker was Dr. Cat Shrier, President and Founder of Watercat Consulting, LLC. Her company focuses on finding innovative approaches to sustainable water management. Dr. Shrier emphasized the need for the water industry to pay attention to the whole water cycle. While the main goal might be to deliver water to customers, issues such as water waste, integration with nature, the energy cost of water, and energy production from water should be given higher priority. However, water utility companies tend to be very conservative and not supportive of public discussion. Dr. Shrier underscored the need to fight this and to create open spaces for education and public discussion; she mentioned her website for this, waterwonks.com, which will be unveiled soon. The conservatism and close-mindedness of the water industry naturally creates huge problems for women in the field. While more women are coming into the pipeline, there is a dearth of women in management positions. The industry is also self-regulated, with no national association, making it difficult for these issues even to be raised. She stressed that these problems are not history; they are happening now. Dr. Shrier believes the ultimate solution is to stop treating diversity as a nonessential, secondary issue, though she admitted being unsure how to make that change happen.

All of the speakers touched on issues facing women in science that warrant a separate summary. Institutions, companies, and government all need diverse voices to make change and to be successful. We need more women leaders. As an example, we are woefully lagging behind other developed nations in the percentage of women holding elected office. Women need mentors and need to be mentors. Dr. White supervised an undergraduate woman for the research she presented. Ms. Graf mentioned a dark quote from Madeleine Albright: “There is a special place in hell for women who don’t help other women.” Further, we need men to be involved. Dr. Shrier noted that women cannot be solely responsible for their success or failure, as so much of career success has to do with standing on the shoulders of others. Lastly, all of the speakers testified to the importance of speaking one’s mind, whether it’s a call to action, an unpopular viewpoint, or bringing up a topic that isn’t normally discussed.

A reflection upon hard drives

A few months ago I was having an issue: my laptop’s hard drive was running out of storage space. I was in the heat of a project that I couldn’t put on hold, so every day I had to find a few unimportant files to delete. Just five years ago this computer was state-of-the-art and had seemingly endless storage: 100 gigabytes. But, like a large purse, I found ways to fill it. The documents and photos didn’t take up much space, but my music collection claimed a large chunk. At the time I didn’t sweat it, as the 40 GB or so of free space was surely enough for an eternity. But my music collection bloomed. To compound this, about a year ago I started a video collection, and within a few weeks the drive was filled to the brim.

After my project was done, I knew I needed to reflect on the state of my laptop. The big picture was sunny: the outside still looked nice and not outdated, I had upgraded the RAM a couple of years ago to give it some spring in its step and replaced the battery, and the processor was still capable. So instead of getting a new computer, I swapped out the hard drive for a new one. The new hard drive fit snugly in the old one’s spot and boasted 500 GB of storage. I suddenly had an amazing 400 GB of free space (which has sadly since declined to 300 GB).

ANSWER: This is how much more storage we get per square inch of hard drive material today compared to 1956, the advent of the hard drive. QUESTION: What is one billion times more?

Today, you can put 40,000 mp3s onto a drive that fits inside an iPod. In 1956, if you put just one mp3 file onto a hard drive, you would have needed a forklift to move it anywhere. A well-known rule of thumb in the computer industry is Moore’s law, which predicts that the number of transistors on an integrated circuit will double every two years. Translation: your computer gets roughly twice as fast every two years. Hard drive storage has a similar rule, Kryder’s law: the storage per square inch doubles about every two years. My hard drive problem was a beautiful illustration of Kryder’s law in action. In the five years since I bought the original computer, the available storage in a hard drive that size should have doubled at least twice, from 100 GB to 400 GB, and indeed my new drive holds 500 GB. This type of growth is exponential, and the trend has held since 1956. That alone is pretty amazing, but peel back one layer and the story gets more interesting: it’s not the natural progression of a single technology. Every few years the current recording technology hits a wall, and someone has to develop something better. Someone always has, and the trend marches on. One of these transitions happened around 2005, when “longitudinal” recording reached its limit and was replaced with “perpendicular” recording. Unfortunately, we’re now at the limit of perpendicular recording.
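As a back-of-the-envelope sketch (idealizing Kryder’s law as a clean doubling every two years; the function here is just my own illustration), the numbers work out:

```python
def kryder_capacity(initial_gb, years, doubling_period=2):
    """Capacity after `years`, assuming one doubling every `doubling_period` years."""
    return initial_gb * 2 ** (years / doubling_period)

# Five years of growth on a 100 GB drive: two full doublings plus a bit more.
print(kryder_capacity(100, 4))         # two doublings: 400.0 GB
print(round(kryder_capacity(100, 5)))  # ~566 GB, close to the 500 GB replacement
```

Five years is two and a half doubling periods, so the 100 GB → 500 GB upgrade sits right on the curve.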

To understand the source of this limit, let’s dig into what a hard drive actually is. The star of any hard drive is the platter: a disk coated with magnetic material, which is where the information is stored. It is called a “drive” because the platter is forced (driven) to spin so that a stationary read/write head can read and write data at different locations on the platter. (Tangent: “flash drive” is a misnomer, as there are no moving parts inside your USB stick; the name persists because hard drives came first and the two can be interchanged in many systems.) It is called “hard” because the platters are rigid, in contrast to floppy disks. Fittingly, the magnetic material on them is also “hard” in the magnetic sense: it has a high coercivity, which basically means that once it’s magnetized, it’s hard to demagnetize. So once your data is written, it will be permanently stored on the disk unless you make a conscious choice to erase or overwrite it.

The magnetic material on the platter is not uniform. Rather, small islands of magnetic material coat the surface in a random mosaic, separated by thin channels of a nonmagnetic material such as glass. This is called “granular” recording media, as the islands of magnetism are called grains. The grains hold the bits of information: when you save a file, the recording head writes each 1 or 0 by forcing the magnetization of a small patch of grains up or down. The current grain size is about 10 nanometers. So to increase the storage for a given platter size (and continue the exponential trend), we will need to reduce the size of these grains.
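For a sense of scale, here is an idealized upper bound (my own back-of-the-envelope arithmetic, assuming one bit could be stored per 10 nm grain in a perfect square grid, with no spacing or error-correction overhead):

```python
NM_PER_INCH = 2.54e7  # 1 inch = 2.54 cm = 25.4 million nanometers

def ideal_bits_per_sq_inch(grain_nm):
    """Areal density if every grain_nm x grain_nm cell stored one bit."""
    cells_per_inch = NM_PER_INCH / grain_nm
    return cells_per_inch ** 2

bits = ideal_bits_per_sq_inch(10)  # ~6.45e12 bits per square inch
print(bits / 8 / 1e9)              # ~806 GB per square inch, ideally
```

That ideal figure of roughly 800 GB per square inch is the same order of magnitude as real demonstrated densities, which is why shrinking the grains is the lever that matters.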

However, it’s hard to make these islands smaller with the current uncontrolled deposition process. “Bit patterned media” has emerged to solve this problem: using the same precision fabrication methods used to make computer chips, scientists can construct nanoislands of magnetic material that are uniform and evenly spaced, each island corresponding to one bit. This has been shown to increase the data density severalfold, to 500 GB per square inch [1]. The process is not yet optimized, however, so production of this media is expensive and the quality is not yet up to par.

But even if bit-patterned media becomes more efficient to produce, there is still another hurdle: making the domains smaller increases the likelihood that they will spontaneously demagnetize due to random thermal fluctuations. One obvious way around this is to increase the coercivity of the material, but that creates a new problem on the other end: writing becomes more difficult, as a stronger magnetic field is needed to set the magnetization. A clever solution has been devised, called thermally assisted recording. A laser is shined onto the high-coercivity material to heat it, temporarily reducing the coercivity so that writing is easy. Once the data is written, the laser is turned off and the information is sealed into the cooled material, restored to its normal high coercivity.

When combined, bit patterned media and thermally assisted recording have been shown to increase density up to 1,000 GB (1 terabyte) per square inch [2]. If this combination can sustain the exponential growth, then around 2020 we should expect densities of 10,000 GB (10 TB) per square inch. Dive into the details and this means each magnetic domain will be separated by only 1 or 2 nanometers, a gap of only 10-20 atoms. Thus 2020 might mark the limit of magnetic recording; we just can’t get any smaller!
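The implied geometry can be sketched quickly (again my own arithmetic, taking 10 TB per square inch at face value, one bit per square cell, and ignoring all overhead):

```python
NM_PER_INCH = 2.54e7  # 25.4 million nanometers per inch

def bit_cell_pitch_nm(tb_per_sq_inch):
    """Side length, in nm, of a square bit cell at a given areal density."""
    bits_per_sq_inch = tb_per_sq_inch * 1e12 * 8  # terabytes -> bits
    cells_per_inch = bits_per_sq_inch ** 0.5      # square grid of cells
    return NM_PER_INCH / cells_per_inch

print(round(bit_cell_pitch_nm(10), 1))  # ~2.8 nm per bit cell
```

At a pitch of roughly 3 nm per bit cell, the domain plus the gap around it spans only a handful of nanometers, which is where the “we just can’t get any smaller” conclusion comes from.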

So will magnetic media survive? Will it be taken over by some other new thing? Solid state (Flash) drives are gaining popularity for file storage. They are more physically resilient than hard drives since there are no moving parts to break. However, they are still much pricier than hard drives and concerns still loom about their permanence compared to magnetic storage. There are some other good candidates, for instance, phase change random access memory (PCRAM) and spin transfer torque random access memory (STTRAM) [3]. But no matter what ultimately wins, magnetic recording media will be around for a long time as it is cheap, well studied, and very permanent.

On the other hand, the question of survival may be moot to the future average consumer. Cloud-based storage is becoming more prevalent and might take over as our main file storage system. It’s similar to our current monetary system: while you carry some cash on you, most of your money is in a bank and can be accessed from any ATM. So in the future, your laptop’s hard drive (“wallet”) may actually have less storage than it does today and you won’t need to carry so much data (“cash”) on you. Most of your files will be stored on a central cloud that you can access from any computer. So while the companies that own the cloud (e.g. Google or Dropbox) will probably care about which recording medium they use for their massive storage banks, you might have your head in the clouds about the whole issue.

[1] Mate, Mathew. “How new disk drive technologies are pushing the nanoscale limits of materials and mechanics,” MEAM seminar, UPenn, 1 February 2011.
[2] Stipe, B.C. et al. Nature Photonics 4, 484 (2010).
[3] Kryder, M.H. and Kim, C.S. IEEE Transactions on Magnetics 45, 3406 (2009).