Author Archives: bob

Social Distancing and Internet Access

The titanic impact of COVID-19 is driving us to an increasingly digital existence. People who can are working from home. Many if not most universities have shifted to online classes. In some locales, even K-12 students need internet access to continue their schooling.

Electronic data exchanged among physicians, clinics, and hospitals is greatly aiding our understanding of the pandemic’s impact and how we can manage it. Broadband access is now not just a luxury but a necessity of life in the age of a pandemic where social distancing is of utmost importance. A real problem exists in rural areas, however, because broadband simply isn’t available in many places.

Throughout the previous century and into the 21st, there has been a gradual population shift from rural to urban locales. Early on this was driven by a shift from subsistence farming to a reliance on cash crops. Later, it was driven by the mechanization of farming technology.

Rural electrification bolstered the success of rural life. President Roosevelt signed an executive order in 1935 which was followed later with legislation creating the Rural Electrification Administration. Were it not for this act, life in rural areas would have disappeared even faster. Electrification brought some parity to rural life compared to life in the cities.

As we now rapidly transition to the age of the internet, there is a new form of disparity between the cities and rural areas. Access to broadband internet is becoming essential to both learning and earning in contemporary society. Increasing numbers of jobs depend absolutely on broadband internet. With quality internet access, many jobs could come back to rural areas. Rural life is inherently attractive to many, but there has to be an income source.

The value of broadband internet has been recognized now and even the smallest schools have access. But what about when the children go home? Not so much. The best medium for broadband is fiber optic cable, but the cost of rolling out the cable is unattractive to commercial entities. Broadband can be delivered via a cell phone signal in many rural areas, but again, low population densities mean low returns on private investment.

The Ozarks present a particular difficulty because of the topography: deeply cut serpentine valleys mean even more towers are necessary for complete coverage. It is time to consider a significant effort to support bringing broadband internet to rural areas, just like rural electrification. In fact, the electric coops could act to broker the delivery. The poles to string cables are already there. It would require an expansion of the skill set for the coops to manage internet connections, but that in itself would bring jobs back.

It’s time to bridge the digital divide and bring our rural areas into the twenty-first century. Children at home need access to high-speed internet. Modern home security systems require connectivity, even many personal health notification devices for the elderly require access.

We will get through this pandemic, but we need to redouble our efforts to keep all of our society connected via broadband access. Everyone, both urban and rural, needs to be included in our civilization.

Dr. Bob Allen is Emeritus Professor of Chemistry at Arkansas Tech University.

Dealing with an Epidemic

Unless you live under a rock, you are at least aware that we have a viral infection rearing its head in the United States. Whether you call it an epidemic or a pandemic is immaterial. It most likely began in a market in Wuhan, China, where any number of wild animal meats were on sale. Bats have been suggested as the source, but it isn’t yet clear.

The disease caused by this virus is called COVID-19, as the virus is a member of a group known as coronaviruses and it appeared in 2019. The virus itself has been given the name SARS-CoV-2, short for Severe Acute Respiratory Syndrome Coronavirus 2.

The first response by the government has been to close our borders to countries where an infection is already established. This response was too little too late. It appears the virus has been circulating in the United States for weeks now. There are reported cases in 15 states and 6 known fatalities. As a respiratory virus, its symptoms are similar to the annual flu but more lethal. It also seems to be more transmissible.

Meanwhile, on the economic front, the Federal Reserve has taken a step to stimulate business by lowering the rate it charges to loan money. The idea is to stimulate economic activity and get folks out to spend money. Weird huh? On the one hand, we are told to stay home to avoid the possibility of person to person transmission and at the same time get out in the public and spend to get the stock market value back up.

The White House proposed a couple of billion dollars to fight the epidemic and the Democrats have proposed much more. Even if approved, it is not clear how this money will be allocated. Obviously a vaccine must be at or near the top of the list. Testing equipment and medical supplies, from face masks to respirators, are needed. Most important is to disrupt person to person contact. Officials have recommended the usual hand washing and, if you exhibit symptoms, staying home – don’t go to work or school.

But here our for-profit healthcare system begins to fail us. Health and Human Services Secretary Alex Azar, a former drug company executive and pharmaceutical lobbyist, said that although he would want to make it affordable, he won’t promise that it will be. You hear all the time that related vaccines are “free” but the fine print says “with most insurance.” When the working poor get sick, they don’t stay home. If their children get sick, they go to school. There are a lot of folks whose jobs have no sick leave option – you don’t go to work you don’t get paid.

We need a healthcare system that recognizes it only works if it works for all. Free vaccinations, full stop. Payments to those who shouldn’t be going to work, and payments for the care of their sick children. And, importantly, a system that guarantees they will still have a job if they stay home with an illness.

Dr. Bob Allen is Emeritus Professor of Chemistry at Arkansas Tech University.

Climate Modeling

Among the many challenges to the dire predictions of global warming and climate change is the questioning of the accuracy of the computer models that predict how bad it will get and when. The short answer is that the models are good, not just good but very good. If we look back fifty years, when computers were in their infancy and the models very crude, we see a considerable congruency between what was predicted and what is happening.

Predictions about global warming are not new by any measure. As early as the beginning of the 19th century, over 200 years ago, scientists recognized that the atmosphere may be capable of trapping heat. Probably most important in the history of global warming and climate change is the work of Svante Arrhenius, a Swedish chemist. He was awarded the Nobel Prize in Chemistry in 1903 for his theory of electrolytic dissociation.

Less well known at the time was his work examining the impact of Carbon Dioxide in the atmosphere on the climate. In 1895, Arrhenius presented a paper to the Stockholm Physical Society titled “On the Influence of Carbonic Acid in the Air upon the Temperature of the Ground.” He mathematically modeled the impact of varying amounts of Carbon Dioxide and water vapor in the atmosphere using only pencil, paper, and a slide rule.

Climate modeling with computers in the 1970s vastly increased the predictive power, but the computer models are only as good as the assumptions going into them. The modeling done and predictions made look good “in the rear view mirror.” There were over a dozen different models, and while some overestimated warming and some underestimated it, overall they were surprisingly accurate.

The models erred due to unforeseen changes in the variables. As time goes on, however, the unforeseen decreases with better understanding. One example is the NASA model by James Hansen that overestimated the heating; the error was due to an unanticipated reduction of Chlorofluorocarbons in the atmosphere. This decrease came about through an international effort to deal with the unrelated environmental issue of the Ozone hole.

The computer models calculate the heat input from the sun and output via radiation. Among the variables that impact these calculations are the amount of water in the atmosphere and whether it is in the form of vapor, which warms the air, or clouds, which reflect the sunlight, creating a cooling effect. The albedo of the planet, that is the reflectivity, is important and varies between land and sea, and winter and summer due to snow and ice. The temperature of the oceans impacts how much of the greenhouse gases will be absorbed from the atmosphere because the solubility of gases in water is temperature dependent.
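The core balance the models compute can be sketched in a few lines: absorbed sunlight, S(1 − albedo)/4, must equal the planet’s outgoing blackbody radiation, σT⁴. A minimal zero-dimensional version, using standard published values for the solar constant and an assumed average albedo, yields Earth’s well-known effective temperature of about 255 K; the roughly 33-degree gap to the observed average surface temperature of about 288 K is the greenhouse effect.

```python
# Zero-dimensional energy balance: S * (1 - albedo) / 4 = sigma * T**4
SOLAR_CONSTANT = 1361.0   # W/m^2, sunlight arriving at Earth's distance
ALBEDO = 0.30             # planetary reflectivity (assumed average value)
SIGMA = 5.67e-8           # Stefan-Boltzmann constant, W/m^2/K^4

def effective_temperature(solar=SOLAR_CONSTANT, albedo=ALBEDO):
    """Temperature at which outgoing radiation balances absorbed sunlight."""
    absorbed = solar * (1 - albedo) / 4  # /4 spreads sunlight over the sphere
    return (absorbed / SIGMA) ** 0.25

print(round(effective_temperature()))        # about 255 K
# More snow and clouds (higher albedo) cools; a darker planet warms:
print(round(effective_temperature(albedo=0.32)))
```

Real models divide the planet into millions of grid cells and track water vapor, clouds, ice, and ocean heat in each one, but every cell is doing bookkeeping of this same kind.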

Climate modeling gets better by the day. There is no conceivable reason for the world’s scientists to act in concert to defraud the public. That is just silly. It does make sense however for those who profit from pollution to deny the pollution, or try to divert attention from the major culprit – burning fossil fuels.

Dr. Bob Allen is Emeritus Professor of Chemistry at Arkansas Tech University.

Avian Scavengers

A number of birds will scavenge animal remains of all sorts. The most obvious are vultures, but eagles, crows, and even smaller birds such as roadrunners and jays will scavenge as well. This is an important ecological service, cleaning up what could otherwise be a source of disease. Actually, some scavenging birds not specially adapted to the lifestyle can become ill from eating carrion.

Some scavenging birds, including our national symbol the bald eagle, are on occasion poisoned with lead bullet fragments in gut piles left over from field-dressed deer. A couple of lead fragments the size of a grain of rice can be lethal to an eagle.

The most commonly encountered scavengers are the Black and Turkey Vultures, frequently seen cleaning up road-killed armadillos, possums, skunks, etc. They share some features and are distinct in others. Both are dark birds with bald heads. The lack of a feathered pate was thought to be for hygiene, but recently it’s been suggested to be for thermoregulation. They also share incredibly acidic stomachs. The gastric juices of the Turkey Vulture are more acidic than a car battery and nearly one hundred times as acidic as our stomachs.
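That factor of one hundred is the logarithmic pH scale at work: each pH unit is a tenfold change in hydrogen ion concentration, so a difference of two units is a factor of one hundred. A minimal sketch, using assumed round-number pH values purely for illustration:

```python
# pH is -log10 of hydrogen ion concentration, so the acidity ratio
# between two fluids is 10 raised to the pH difference.
def acidity_ratio(ph_a, ph_b):
    """How many times more acidic fluid A is than fluid B."""
    return 10 ** (ph_b - ph_a)

# Assumed round-number values for illustration only:
ph_vulture_stomach = 1.0   # vulture gastric juice sits roughly at pH 0-1
ph_human_stomach = 3.0     # human gastric juice is roughly pH 1.5-3.5

print(acidity_ratio(ph_vulture_stomach, ph_human_stomach))  # two pH units -> ~100x
```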

Vultures literally turn down their temperature at night by several degrees. In the morning they can be seen with their wings spread wide to the sun. Another method of temperature regulation involves urohidrosis, defecating down their legs in hot weather. This provides some cooling and may also be beneficial in the management of bacteria on their legs.

If approached by a predator or human for that matter they vomit. Some authors suggest this is a defensive action, as their vomitus is so acidic that it can cause burns. Other authors disagree, but all agree that the loss of the vomitus lightens the body weight to aid take-off for escape. Neither variety of vulture has a syrinx so they can only make grunting or hissing noises.

Black Vultures are slightly smaller than the Turkey Vulture. They have a poor sense of smell, therefore they detect a meal via keen eyesight. They will frequently soar higher than other vultures and follow them to a carcass. Black Vultures can show up in large numbers during the birth of livestock. Not only will they consume the afterbirth but also attack and kill newborn animals, and hence can constitute a serious agricultural pest.

Turkey Vultures are slightly larger and have a distinctive red head. Also, they have a somewhat brownish tinge to their feathers when viewed up close. They have the keenest sense of smell among birds. They tend to soar somewhat closer to the ground sniffing constantly for the odor of a carcass.

The importance of vultures was highlighted in India and Pakistan a decade ago. When the vulture population crashed due to residues of the arthritis drug diclofenac, toxic to vultures, in cattle carcasses, stray and feral dogs stepped up to fill the niche as scavengers. The dog population rapidly increased and so did rabies, causing a spike in human deaths from dog bites.

Dr. Bob Allen is Emeritus Professor of Chemistry at Arkansas Tech University.

Feedback Accelerates Global Warming

Positive feedback to global warming keeps climate scientists up at night. Feedback can accelerate the rate of change in the climate. The worst-case scenario involves tipping points where the heating of the air and water make parts of the planet uninhabitable.

A frightening irony concerning global warming is the use of air conditioning to combat the heat. We obviously don’t think twice about cooling our homes and offices, but that is part of our problem. Air conditioning requires power, and the electricity comes to a large degree from burning fossil fuels, which contribute to global warming. This short term reaction to a meteorological phenomenon contributes to a long term climatological phenomenon.

Climate scientists call this positive feedback. Burning fossil fuels to run air conditioners contributes carbon dioxide to the atmosphere which traps heat and makes the planet warmer, which requires more air conditioning, which causes carbon dioxide release which causes more heating which… You get the picture.

In this case, the feedback has a human element. Other feedback loops are purely physical phenomena. These feedback loops add to the complexity of climate modeling. Predictions of future climate rely on computer calculations, the accuracy of which depends on how well the variables in a climate system are understood.

Water vapor in the atmosphere is without question the most important of the global warming feedback loops. Water vapor is a strong greenhouse gas itself; that is, its presence in the atmosphere traps heat and contributes to global warming. The positive feedback comes about because the amount of water vapor the atmosphere can hold rises steeply with its temperature. The hotter the atmosphere, the more water vapor; the more water vapor in the atmosphere, the more heat trapped; the more heat trapped, the more water vapor… you get the picture.

When natural gas (methane) is burned it produces carbon dioxide, a major greenhouse gas. Unburned methane released to the atmosphere is itself a potent greenhouse gas. There are vast stores of methane trapped in the permafrost of the tundra and the continental shelf under the oceans. As the planet warms, the permafrost thaws which releases methane. Methane release warms the planet thawing more permafrost which releases more methane which… you get the picture.

The amount of solar heating of the planet is a function of the albedo, the reflectivity of the planet. Sunlight is strongly reflected by snow-covered expanses near the north and south poles. As the climate warms due to global warming, the snow melts exposing soil which is much less reflective.

The less reflective soil traps more heat, warming the planet further, melting more snow, which traps more heat… you get the picture.

The same is true of sea ice and the oceans. Ice is more reflective than water. As the ice cover melts, more heat is trapped, as more heat is trapped more ice melts… you get the picture.
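These “you get the picture” chains are not, fortunately, runaway processes: if an initial warming ΔT induces an extra feedback warming f × ΔT, which induces f² × ΔT, and so on, the series converges to ΔT/(1 − f) as long as the feedback fraction f is less than one. A sketch with an assumed feedback fraction:

```python
def amplified_warming(direct_warming, f, rounds=60):
    """Sum the feedback chain: direct + f*direct + f**2*direct + ..."""
    total = 0.0
    term = direct_warming
    for _ in range(rounds):
        total += term
        term *= f          # each round feeds back a fraction f of the last
    return total

direct = 1.0   # assumed direct warming in degrees C, for illustration
f = 0.5        # assumed feedback fraction; must be below 1 for convergence

print(amplified_warming(direct, f))   # converges toward direct / (1 - f) = 2.0
```

The worry behind tipping points is that some regional feedbacks may push the effective f close to, or past, one, at which point the series no longer converges to a modest multiple of the direct warming.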

A final irony is that as the area of the oceans covered with ice shrinks, it opens more area to exploration and, ultimately, production of crude oil. Burning these fossil fuels adds more carbon to the atmosphere, which warms the planet… I hope you get the picture.

Dr. Bob Allen is Emeritus Professor of Chemistry, Arkansas Tech University.

Minority Rule Is the Law of the Land

In the lifetime of a current college student, the loser of the popular vote has twice been selected president of the United States. In 2000, Al Gore received about half a million more votes than George Bush, but Bush was elected president. In 2016, Hillary Clinton got about three million more votes than Donald Trump, yet Trump is the president. This has happened on only three other occasions in American history.

Currently, the Republican party controls fifty-three votes in the Senate; the Democrats and the Independents who caucus with them hold forty-seven. Although the Republicans control the majority of votes in the Senate, they represent only forty-four percent of the voters in the United States. We have minority rule in both the presidency and the US Senate.

This disparity in who decides the law of the land is a result of the “Great Compromise” between the power and influence of the small versus large states. The members of the House of Representatives, often referred to as the people’s house, are elected by popular vote. Each House member, regardless of state, represents about three-quarters of a million people. The Senate is different: each state gets two senators regardless of size.

At the time of the writing of the constitution, the difference between the populations of the most and least populous states was not as great as it is today. The most populous state, Virginia, had nineteen times as many voters as the least populous, Georgia. Now, California has nominally seventy times as many voters as Wyoming.

The imbalance of votes in the electoral college follows from the imbalance in the Senate. Each state gets electors equal to its number of representatives and senators combined. When population is considered, an electoral vote in California is worth only one-fifth that of a vote in Wyoming.
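The arithmetic behind the disparity can be checked directly. The figures below are rough 2019-era population estimates, assumed only for illustration; counting actual voters rather than residents would shift the ratio further.

```python
# Rough 2019-era figures, assumed for illustration:
state_data = {
    #              (population, electoral votes)
    "California": (39_500_000, 55),
    "Wyoming":    (580_000, 3),
}

def people_per_elector(state):
    """Residents represented by each of a state's electoral votes."""
    population, electors = state_data[state]
    return population / electors

ca = people_per_elector("California")  # roughly 718,000 people per elector
wy = people_per_elector("Wyoming")     # roughly 193,000 people per elector
print(f"A Wyoming resident's vote carries about {ca / wy:.1f}x the weight")
```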

Compounding the problem is the fact that most states award electoral votes on a winner take all basis. The states get to decide how to apportion popular vote to electors to the electoral college.

Voters in small states have more “electoral oomph” when it comes to electing the president and the composition of the Senate. We currently have minority rule in the presidency, the Senate and the courts due to the responsibility of the Senate to approve federal judges at all levels. Democracy is only found in the House of Representatives. Elsewhere, the minority is thwarting the will of the majority.

Any remedy is hard to come by. Direct election of the president by popular vote would go a long way toward alleviating the issue of the electoral college, but it requires amending the constitution. Some argue that direct election of the president is impractically complex, but we do it in every other jurisdiction in the country.

Fixing the disparity in representation in the Senate is even more difficult. Breaking up the big states into smaller pieces by creating senate districts would work. Likewise, combining the smaller states into super senate regions is possible. Neither of these is likely – as in, no way, José.

Dr. Bob Allen is Emeritus Professor of Chemistry, Arkansas Tech University.

There’s a Reason for Regulations

In 1516, Duke Wilhelm IV of Bavaria decreed that beer could contain only three ingredients: barley, hops, and water (yeast, not yet understood, was added to the list centuries later). This regulation, the Reinheitsgebot, persists to this day in Germany. The absence of such regulations follows the dictum of caveat emptor, the principle that the buyer is solely responsible for ensuring the quality and suitability of goods to be purchased.

If you are buying apples or oranges, it isn’t all that difficult to tell just what you are buying, but in an increasingly complex society, with its range of foods and especially drugs, it isn’t so easy. For decades preceding the turn of the twentieth century, the government struggled with protecting us from harm. The Pure Food and Drug Act was passed in 1906, and with it came the creation of the Food and Drug Administration (FDA).

About this time agriculture was shifting from subsistence farming to larger corporate operations. Increasingly, processing of foods made them less easily identifiable as pure. The muckraking novel “The Jungle” by Upton Sinclair was published in 1905. The novel highlighted the exploitative and unsanitary conditions in the meatpacking industry.

Since that time our food and drug supply have become much safer due to a myriad of regulations from the FDA and the Agriculture Department. Drugs must pass rigorous testing for purity and efficacy, and the United States Department of Agriculture (USDA) has been charged with, among other things, inspecting the meatpacking industry. But with regulations come costs. That meat from infected carcasses doesn’t end up at the grocery store, and that toxic and carcinogenic molds don’t contaminate peanuts, are the results of regulations.

Ronald Reagan once famously said to let business be business, implying that regulations were stifling commerce. This began the legitimization of deregulation to streamline government involvement in business. The Trump administration has carried this torch onward. Warning letters, a key tool for keeping dangerous drugs or tainted food off the market, have fallen by one-third since Trump took office. The rarer but stricter injunctions have also dropped by close to thirty percent.

Deregulation in the meatpacking industry includes allowing faster processing of carcasses and a reduction of the number of USDA inspectors. In some plants, the USDA inspectors have been replaced entirely by plant employees. Do you really think that a meatpacker whose business is to profit from packing meat will view an infected or contaminated carcass with the same critical eye as an inspector paid by the USDA?

The deregulatory zeal goes far beyond food and drugs. The laws and regulations which protect our air and water are under assault. Overturning the Obama-era Clean Power Plan means there will be more Mercury in our water and more Ozone in our air, just to name two.

Other actions include allowing greater occupational exposure to toxic substances, and an increase in allowed “accidental release” of toxic materials. Regulations protect us from harm. Business may not like them, and they may cause a small increase in the costs of goods and services but overall they make our society safer. It’s called civilization.

Dr. Bob Allen is Emeritus Professor of Chemistry, Arkansas Tech University.

Ocean Woes

Threats to the biosphere from changes in the oceans are real. Global warming involves not just atmospheric heating but also sea surface warming. More than ninety percent of the excess heat trapped by greenhouse gases ends up in the oceans. This can have wide-ranging effects, with deoxygenation at or near the top of the list of risks.

The solubility of gases in water falls as the temperature rises – the Henry’s law constant that governs solubility is temperature dependent. What this means is that warmer water holds less oxygen. Anglers in Arkansas recognize three distinct kinds of conditions for fishing. Likely the most common fishery in Arkansas is a lake where the water temperature, and hence the oxygen content, supports fish such as largemouth bass, sunfish, and the like.

If you are after smallmouth bass you are unlikely to find them in a lake, at least here in Arkansas. Smallmouth bass require a higher oxygen content that is available only in cooler water – usually clear streams that flow fast enough to avoid warming from the sun. It is not uncommon to see smallmouth bass at the cooler upstream ends of creeks and largemouth at the lower, warmer reaches.

Trout are the most demanding in terms of oxygen needs. Trout only thrive in cold water with the highest oxygen concentration. Here in Arkansas that means creeks that get the majority of their flow from springs, and the cold tailwaters of impoundments.
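The temperature–oxygen relationship behind this digression can be made concrete. The saturation values below are approximate sea-level freshwater figures, included only for illustration:

```python
# Approximate dissolved-oxygen saturation in fresh water at sea level
# (mg per liter); round-number values for illustration only.
do_saturation = {0: 14.6, 10: 11.3, 20: 9.1, 30: 7.6}

def do_at(temp_c):
    """Linearly interpolate dissolved-oxygen saturation at temp_c."""
    temps = sorted(do_saturation)
    for lo, hi in zip(temps, temps[1:]):
        if lo <= temp_c <= hi:
            frac = (temp_c - lo) / (hi - lo)
            return do_saturation[lo] + frac * (do_saturation[hi] - do_saturation[lo])
    raise ValueError("temperature out of tabulated range")

# A cold spring-fed trout creek versus a warm summer bass lake:
print(do_at(5), do_at(28))
```

Roughly speaking, water near freezing carries almost twice the oxygen of water at bathtub temperatures, which is the whole story of trout versus largemouth bass in one number.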

The point of this freshwater digression is that the variety and number of fish in a given locale depend on water temperature. This is also true in the oceans. There is a reason that megafauna such as whales spend their time in the cold, oxygen-rich waters of the Arctic and Antarctic regions: that’s where their food is found in abundance. As the surface of the oceans warms, we should expect changes in where fish and sea mammals alike can survive. Just that sort of change is happening, and it doesn’t look good.

Cod are an extremely important commercial fish found in northern regions of both the Atlantic and Pacific Oceans. The importance of this fish alone cannot be overemphasized. The coastal regions of northern Europe have depended to a large degree on access to cod. In the middle of the twentieth century, the United Kingdom and Iceland were all but at war over rights to fish for cod in the North Atlantic.

The trouble with cod now centers in the North Pacific. Just last week, the Gulf of Alaska was closed to cod fishing for the 2020 season. Stocks have been declining for several years, not from overfishing as occurred in the Grand Banks region of the Atlantic, but from ocean warming. The Arctic is warming much faster than the rest of the planet. Glaciers are receding, arctic ice is diminishing and now fish stocks are dwindling.

In the future, it is conceivable that other more tolerant species of fish can migrate into the warming Arctic waters but for other locales, this isn’t possible. Fish currently in the tropics are already the only species tolerant of the lower oxygen concentrations. Higher temperatures will likely create fish “deserts.”

Dr. Bob Allen is Emeritus Professor of Chemistry, Arkansas Tech University.


Size Matters/Wind Turbines

Utilization of the wind for motive power has a long and rich history. Wind-powered sailing vessels were known to ply the Nile River somewhere between three and five thousand years before the common era (BCE). Although there is no direct evidence, it is quite possible that sailing craft were employed 50 thousand years ago in the peopling of Southeast Asia and Australia.

Stationary power production in the form of lifting water has been dated to a few centuries BCE. Similarly, the wind was used for motive power to grind grain. The use of windmills in the Netherlands is legendary. By the 14th century CE, the Dutch were making extensive use of windmills to pump water out of the Rhine river basin to recover and maintain dry land. There is a reason this part of Europe is referred to as the “Low Countries.”

The history of wind for the generation of electrical energy is of course much younger. In 1887, Professor James Blyth in Scotland erected a 10-meter tall cloth-sailed wind turbine connected to batteries to light his cottage. Small scale, isolated wind-powered electrical production has been in use around the world, including the early twentieth-century Midwestern United States. Centralized power delivered via rural electrification in the 30s replaced virtually all of these small systems.

The modern era of wind-powered electricity production began in the 70s following the formation of the Organization of the Petroleum Exporting Countries (OPEC) and the subsequent oil price shocks and embargoes. The price of crude oil skyrocketed and shortages of gasoline forced rationing. Later years saw the federal government subsidize wind power with grants and production credits. In 1990, less than one percent of total electrical energy in the United States came from wind. Currently it is over seven percent.

The real change in wind power is the size of the turbines themselves. The earliest modern turbines averaged 50 kW, enough to power only a handful of homes. Also, these early turbines were erected on derricks, which made for attractive roosting sites for birds, especially raptors, and led to unacceptable bird kills. The development of monocoque supporting towers has greatly reduced, but not eliminated, bird kills.

By the start of the twenty-first century, the average turbine size had increased 30-fold. These giants produce about 2 MW. Simple calculations show that the midwestern United States could easily produce all the electrical needs of the country, except for the distribution problem: most Americans live near the coasts, far from the windy central United States.

The real expansion of wind power will occur with off-shore installations. Most off-shore wind is now located in shallow near-coastal areas, but plans for real behemoths on floating towers are in the works. Each of these 20+ MW plants, taller than the Eiffel Tower, can provide energy for tens of thousands of homes.
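The physics behind “size matters” is that a turbine’s power grows with the square of the rotor radius (the swept area) and the cube of the wind speed: P = ½ · Cp · ρ · πr² · v³. A sketch with assumed round-number values for wind speed and efficiency:

```python
import math

def turbine_power_watts(rotor_radius_m, wind_speed_ms,
                        cp=0.45, air_density=1.225):
    """P = 1/2 * Cp * rho * (swept area) * v**3.

    cp (the power coefficient) and the wind speed are assumed
    illustrative values; the Betz limit caps cp at about 0.59.
    """
    swept_area = math.pi * rotor_radius_m ** 2
    return 0.5 * cp * air_density * swept_area * wind_speed_ms ** 3

small = turbine_power_watts(5, 12)    # roughly an early 50 kW class machine
large = turbine_power_watts(40, 12)   # roughly a modern 2 MW class machine
print(f"{small/1e3:.0f} kW vs {large/1e6:.1f} MW")
```

An eightfold increase in rotor radius yields a 64-fold increase in power in the same wind, which is why the industry keeps building taller, and why the giant floating off-shore designs are worth the engineering trouble.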

Both wind energy production and potential continue to grow. The cost of energy production continues to drop and with the advent of large off-shore plants comes more reliability and less intermittency.

A Natural Reaction

Uranium is, of course, the stuff of nuclear reactors and atomic weapons, but it is also part of an intriguing detective story from 1972 that traces back to events two billion years ago – actually 4.5 billion years but at that age who’s counting.

First, a little background. Both nuclear reactors and bombs require Uranium 235. This isotope of Uranium has fewer neutrons in its nucleus and is present in small concentrations alongside U238, the most common isotope. U235, with a natural abundance of 0.72 percent, must be enriched further before it can fuel reactors or weapons.

In the early seventies, there was something of a panic in France. France, then as now, gets the lion’s share of its electricity from nuclear reactors. At the time, France was buying Uranium ore concentrate, called yellow cake, from a mining region in the Oklo River basin in Gabon, Africa. Assays of some shipments showed that the ore was unnaturally low in U235, sometimes by as much as half the expected concentration.

During this period there was much civil unrest as the continent slowly emerged from under the yoke of colonialism, and it was feared that Uranium was being stolen by a local tribe with the intention of making a crude bomb. It turns out the real explanation was a rare event in nature. When scientists analyzed the shipments low in U235, they found several unnatural elements such as Americium, Curium, and Plutonium.

These so-called transuranic elements were not known to exist in nature until this discovery. The only place they had been observed was in the waste from nuclear fission, in both controlled reactors and bombs. The French had discovered an extremely rare event: a natural nuclear reactor.

When enough U235 atoms are drawn close enough together, a chain reaction occurs which produces heat. That heat is used to produce steam in nuclear reactors. In the process, the U235 is transmuted into other elements. Exactly the same process occurred in the Oklo River basin.

Over two billion years ago there was scant free Oxygen in the air, then along came cyanobacteria. Gradually the atmosphere changed and many minerals reacted with Oxygen. All the rusty looking soil across the planet is due to Iron Oxide which formed during this period.

In the case of Uranium, it became more water-soluble as it oxidized. In locations with rich Uranium deposits such as the Oklo River basin, this allowed the dissolved Uranium to accumulate in shallow lakes. Over time some of these lakes became isolated, and as they evaporated the Uranium was concentrated. Other bacteria, capable of taking the Oxygen back away from the Uranium Oxide, reduced the solubility even further.

When the Uranium in these pools reached critical mass – the concentration and quantity necessary to sustain a chain reaction – the U235 fissioned, producing heat and forming the transuranic elements. As the reactions proceeded, the U235 was depleted. Altogether, sixteen different sites in the river basin have been found to have undergone fission reactions. To date, this is the only known place on Earth where a natural fission reaction has occurred.
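One more piece makes the timing clear: today’s natural Uranium, at 0.72 percent U235, is too dilute to sustain a water-moderated chain reaction. Oklo worked because U235 (half-life about 704 million years) decays much faster than U238 (about 4.47 billion years), so two billion years ago the natural enrichment was far higher. A back-of-the-envelope check:

```python
# Half-lives in billions of years
T_U235 = 0.704
T_U238 = 4.468

def abundance_then(years_ago_gy, u235_now=0.72, u238_now=99.28):
    """Back-calculate the U235 atom-percent some billions of years ago."""
    # Running radioactive decay backwards: N_then = N_now * 2 ** (t / half_life)
    u235_then = u235_now * 2 ** (years_ago_gy / T_U235)
    u238_then = u238_now * 2 ** (years_ago_gy / T_U238)
    return 100 * u235_then / (u235_then + u238_then)

print(round(abundance_then(2.0), 1))  # roughly 3.7 percent
```

That is comparable to the enrichment of fuel in modern commercial reactors, which is why groundwater seeping through a rich ore body could start a chain reaction then, but never could today.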

Dr. Bob Allen is Emeritus Professor of Chemistry, Arkansas Tech University.