DU Home » Latest Threads » NNadir » Journal


Profile Information

Gender: Male
Current location: New Jersey
Member since: 2002
Number of posts: 25,707

Journal Archives

Sure. I'd be very pleased to do so. A look at the Danish data for wind turbines is also...

...instructive though.

One needs to do a little Excel manipulation, comparing the commissioning and decommissioning dates, to see that this useless crap becomes landfill, on average, in about 15 years, meaning that, as with the case of the atmosphere, future generations will be screwed by our environmental wishful thinking.

The Danish Excel spreadsheets are here: Master Table of the Performance of Danish Wind Turbines
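The spreadsheet manipulation is simple enough to sketch. Assuming, hypothetically, a per-turbine export with grid-connection and decommissioning dates (the actual Danish master table's column names differ, and the dates below are made up for illustration), the average service life falls out in a few lines:

```python
from datetime import date

# Hypothetical per-turbine records: (grid connected, decommissioned).
# Illustrative dates only; not taken from the actual Danish table.
turbines = [
    (date(1990, 5, 1), date(2004, 8, 1)),
    (date(1995, 3, 1), date(2011, 6, 1)),
    (date(2000, 1, 1), date(2014, 2, 1)),
]

def mean_service_life_years(records):
    """Average span, in years, between connection and decommissioning."""
    spans = [(end - start).days / 365.25 for start, end in records]
    return sum(spans) / len(spans)

print(f"{mean_service_life_years(turbines):.1f} years")
```

Run against the full decommissioned-turbine table rather than three made-up rows, this is the calculation behind the roughly 15-year figure.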

The much ballyhooed Danish wind program, which inspired a lot of stupidity around the world, after strewing the landscape with thousands upon thousands of leaky crap whirlygigs that depend on the disastrous lanthanide mines of Baotou, China, doesn't produce as much energy as a single nuclear plant built 30 years ago can.

Analysis of this data convinced me that my lack of hostility to the wind industry - which I may have held ten years ago - was inappropriate. The other thing that convinced me was the fact that we squandered a trillion bucks on this garbage in the last ten years alone (found here): Global "Investment" in so called "Renewable Energy".

A glance at the Mauna Loa carbon dioxide observatory might show what the environmental result of this massive squandering has done for climate change: Carbon Dioxide Trends at Mauna Loa

If one were to spend as much time with this data as I have over the last 20-30 years, one could see that the rate of 2.3 ppm per year observed over the last ten years is the highest rate of decomposition of the atmosphere ever observed, meaning all the money squandered on so called "renewable energy" hasn't done shit even to slow the second derivative of the atmosphere's collapse.
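The growth-rate and second-derivative claims are easy to check numerically from annual means. The values below are placeholders, not NOAA's actual record; substitute the real Mauna Loa series:

```python
# Hypothetical annual mean CO2 concentrations in ppm (placeholders;
# the real series is published by the Mauna Loa observatory).
ppm = [393.8, 396.5, 398.6, 400.8, 404.2]

# First difference: the annual growth rate, in ppm/yr.
growth = [round(b - a, 1) for a, b in zip(ppm, ppm[1:])]

# Second difference: the change in the growth rate itself.
accel = [round(b - a, 1) for a, b in zip(growth, growth[1:])]

print("growth rate (ppm/yr):", growth)
print("change in growth rate:", accel)
```

If the second-difference values trend positive over the real record, the accumulation is not merely continuing but accelerating.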

Now, none of this is "peer reviewed;" it merely requires independent critical thinking to utilize.

This is also true of "peer reviewed" papers, with which I've spent the last 30 years reading about energy and the environment; it requires critical thinking as well.

This said, I'm happy to supply some links, some of which say that the environmental impact of the wind industry is inadequately evaluated, others suggesting what that impact is; and no, it's not zero, even ignoring the fact that the wind industry's main achievement is to increase rather than decrease reliance on unsustainable and frankly criminal dependence on dangerous natural gas.

Here are some current papers from the primary scientific literature, for which I have the full text in my files.

From a very recent Chinese analysis of the wind portion of so called "renewable energy" (China faces the most health effects from the so called "renewable energy" scam):

Approach to Evaluate the Reliability of Offshore Wind Power Plants Considering Environmental Impact

Life cycle assessment and net energy analysis of offshore wind power systems (This one includes analysis of steel and concrete impacts, but is very weak on biological impacts.)

Bird Killer, Industrial Intruder or Clean Energy? Perceiving Risks to Ecosystem Services Due to an Offshore Wind Farm

Here's a whole book focusing on the biological implications of this stupid enthusiasm for this absurd scheme to fill the ocean and land with giant greasy turbines:

Wind Energy and Wildlife Interactions. It was published just this year, and it contains lots and lots and lots of "peer reviewed" references for anyone who's interested in the point. The text on the impact of German wind farms on the endangered Red Kite is illustrative.

Germany hosts more than 50% of the global breeding population of Red Kites (BirdLife International 2015), and hence should be responsible for protecting this species. A preliminary analysis (Mammen et al. 2013) showed that many adult Red Kites (older than two years) in Germany were killed by colliding with wind turbines (57 out of 63 cases). However, one and two-year-old Red Kites seemed less affected. Many of these collisions took place during the breeding season and caused both the loss of a partner and the loss of the brood. Bellebaum et al. (2013) modelled the numbers of wind turbine collisions of Red Kites in Brandenburg, Germany, and found that collisions were responsible for a 3.1% decline in the local post breeding population. They state that the mortality of Red Kites due to wind farms is approaching critical thresholds with respect to population growth.

Fuck the Kites; Wind Power is sexxxxxxxxxxyyyyyyyy.

One of the most moving references therein is this one, a scientific paper written as a "plea": The catchment area of wind farms for European bats: A plea for international regulations

Fuck the bats; Wind Power is sexxxxxxxxxxyyyyyyyy.

This paper is open access:

Wind Farm Facilities in Germany Kill Noctule Bats from Near and Far

Turning to the marine area, on which tens of thousands of papers have been written, many in recent years, these scientists, mostly marine biologists, complain that nobody has a clue what effect all this offshore development will have on the benthos, specifically those creatures that live on the sea floor (you know, like mussels): Turning off the DRIP (‘Data-rich, information-poor’) – rationalising monitoring with a focus on marine renewable energy developments and the benthos

Even though we have no idea at all about the effect of the useless and ineffective wind industry (at least with respect to climate change), there's lots of cheering here for a pop news article about mussels.

It's a disgrace.

By the way, I've spent the last 30 years using much of my free time reading the primary scientific literature about energy and the environment. When Carbonite reports on the number of files on my computer it has backed up, the number is usually over 600,000 files. I would guess that at least 50-60% of the papers in my files relate to energy and the environment, with a large portion of that devoted to climate change. This includes an extensive number of papers related to the world's safest and most sustainable form of energy, nuclear energy, which neither the wind industry nor the solar industry can match for low environmental and human impact.

When I started this kind of time intensive obsession, right after Chernobyl blew up, I was a fan of the wind industry and the solar industry, and I remained one until as recently as ten years ago. I was, in 1986, much to my personal disgrace, a critic and opponent of the nuclear industry.

I've changed my mind.

It's not like I did so with inattention. Quite to the contrary, I've invested lots and lots and lots of time, and if nothing else, my opinions, my strong opinions are informed.

If you want "peer reviewed" stuff, you need not ask for it from a blogger on the internet. You can get it yourself. Google Scholar is your friend.

My conclusion after all this work is this: The wind industry and the solar industry are useless, and they are dangerous. I am embarrassed by the rote enthusiasm for these schemes on my end of the political spectrum, the left, the environmental left. The number of recs this vague, and frankly misleading news item generated here is troubling to say the least.

The solar industry in particular, which after half a century of wild cheering can't even produce 2 of the 570 exajoules of energy generated and consumed by humanity, is nothing more than a modern day asbestos, asbestos having been a "wonder material" that generated wide enthusiasm in the mid 20th century, only to become a bane for our generation to clean up - if we dare to clean it up.

The solar industry, which has already participated in causing 10% of the rice crop in Southern China to land above generally acceptable levels of cadmium, will prove far more baleful for future generations, this after we have also dumped trillions of tons of carbon dioxide into the atmosphere in which they must live. We assume, rather criminally, that they will do what we ourselves were incompetent to do; this is the real meaning of all this "by 2050, or by 2080, or by 2100" crap put out by assholes like the poorly educated bourgeois brats at, say, Greenpeace. It's not going to happen; they are likely to live in a long running disaster movie and will lack the resources to help themselves. The planet will be a giant Puerto Rico, circa 2017.

Future generations may not forgive us; personally, I don't think they should. The lies we told ourselves will bear no expiation of our place in history, which may be recorded as the "worst generation. Ever."

Have a nice Sunday evening.

Trispecific Antibody to HIV Developed; May Represent A Real Cure for AIDS.

AIDS, as many people know, is caused by HIV, a "retrovirus," that is, a virus that carries not DNA but RNA, which is "reverse transcribed" into DNA in infected cells, whereupon the DNA activates machinery to construct new viral particles, which ultimately rupture the cells, releasing more viruses.

The HIV virus thus offers two opportunities for transcription error: during reverse transcription to DNA, and during formation of RNA for new viral particles. Moreover, the HIV replication machinery contains no transcription error correcting mechanism. Mutations in HIV thus arise roughly 10X faster than in DNA viruses.
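The consequence of that missing proofreading can be put in rough numbers. Using commonly cited approximations (my figures, not from the text above): an error rate on the order of 3×10⁻⁵ per base and a genome of roughly 9,700 bases imply about 0.3 mutations per genome per replication cycle:

```python
# Back-of-envelope mutation load per HIV replication cycle.
# Both figures are commonly cited approximations, not exact values.
error_rate = 3e-5   # reverse transcription errors per base (no proofreading)
genome_len = 9_700  # approximate HIV-1 genome length in bases

mutations_per_cycle = error_rate * genome_len
print(f"~{mutations_per_cycle:.2f} mutations per genome per cycle")
```

On these numbers, with billions of virions produced daily, essentially every possible single-point mutant of the genome arises somewhere in an infected person's viral population every day.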

It was my privilege to work, albeit in a peripheral sense, on the first several members of the second class of anti-HIV drugs, the protease inhibitors, in the mid 1990's; the first class was the reverse transcriptase inhibitors, like AZT. This class of drugs had a real impact on the disease; the survival of people like Magic Johnson, for one case, is a testimony to their success.

The HIV protease cleaves viral zymogens, zymogens being proteins that are inactive until part of each molecule is cleaved by the protease. In this case the enzyme is an "aspartyl" protease, so named because its active site employs catalytic aspartic acid residues, and its cleavages activate the viral proteins reverse transcriptase and integrase, as well as more of the protease itself. Without this cleavage the viral proteins are inactive and the virus cannot function as a virus; it is inactivated, but not destroyed.

However, because of the rapidity of mutation in the virus - over ten billion viral particles are produced each day in an infected person with active AIDS, with a new generation of viruses appearing every 2.4 days, about 140 generations per year - resistant strains of the virus can and do arise rapidly.

For the first generation of protease inhibitors developed in the 1990's, resistant strains had appeared for all of them by the year 2000.

The amino acid substitutions for the mutant strains to these drugs are listed here, where the letters refer to the codes for specific amino acids, and the numbers refer to the position in the HIV protease:

D30N: Nelfinavir (Agouron/Pfizer)
M46I/I47V/I50V: Amprenavir (BMS)
L10R/M46I/L63P/V82T/I84V: Indinavir (Merck)
M46I/L63P/A71V/V82F/I84V: Ritonavir (Abbott)
G48V/L90M: Saquinavir (Roche)

The companies in the parentheses are the companies that developed these drugs.

(cf: Protein Science (2000) 9: 1898-1904)
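The substitution shorthand is mechanical, and parsing it makes the notation concrete. A minimal sketch (the helper function is mine, purely illustrative):

```python
import re

def parse_substitution(code):
    """Split e.g. 'D30N' into (wild-type residue, position, mutant residue)."""
    m = re.fullmatch(r"([A-Z])(\d+)([A-Z])", code)
    if m is None:
        raise ValueError(f"not a substitution code: {code!r}")
    wild, pos, mut = m.groups()
    return wild, int(pos), mut

# D30N: aspartate (D) at protease position 30 replaced by asparagine (N)
print(parse_substitution("D30N"))  # ('D', 30, 'N')

# Multi-mutation strains are slash-separated:
print([parse_substitution(c) for c in "G48V/L90M".split("/")])
```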

Modern treatment for AIDS is not really curative; it is palliative, and relies on a drug cocktail: a reverse transcriptase inhibitor (the class containing AZT), a protease inhibitor, and a third class, the entry inhibitors, which include the CCR5 antagonists and the fusion inhibitors. It is hoped (and happily often observed) that the combination is effective, if expensive, with failure to observe the regimen actually promoting the generation of resistant strains. (This is also true of other anti-infectives, such as antibiotics; however, with antimicrobials such as antibiotics, the rate of evolution of resistant strains is slower.)

These drugs do not kill the virus; they inactivate it or in some (problematic) cases, slow its replication without actually halting it.

It is thus with some excitement that I came across this paper in the most recent issue of Science: Trispecific broadly neutralizing HIV antibodies mediate potent SHIV protection in macaques

(Xu, Pegu et al., Science 10.1126/science.aan8630; final page numbers not yet assigned.) The paper was published by a consortium of scientists from the pharmaceutical company Sanofi and a team of academic and government institutions, the latter type of institution being under attack by the orange ignoramus in the White House and his fellow science-hating enablers in Congress and his cabinet.

"Trispecific" means that the antibody has multiple "CDRs," or "Complementarity Determining Regions," designed to bind to different targets. Antibodies are, of course, Y-shaped proteins that mediate immune responses, and the CDRs are short sequences of amino acids in these proteins that recognize foreign or diseased entities and attach to them, resulting in their destruction or inactivation. Most antibodies are monospecific, attacking a single region displayed on the foreign body. This protein, by contrast, has been designed to simultaneously attack any of three different regions involved in HIV pathology; by doing so it reduces the avenues by which this wily virus can escape destruction and thrive.
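The logic of "fewer avenues of escape" can be illustrated with a toy calculation. If a strain escapes each specificity independently with some probability (an assumption real epitopes need not satisfy, and the probability below is invented), escaping a trispecific antibody requires escaping all three at once:

```python
# Toy independence model of viral escape from multi-specific antibodies.
# The single-arm escape probability is illustrative, not a measured value.
p_escape_single = 0.10

for n_specificities in (1, 2, 3):
    p = p_escape_single ** n_specificities
    print(f"{n_specificities} arm(s): escape probability {p:.4f}")
```

Under these toy numbers, three specificities reduce escape from 1 in 10 to 1 in 1,000, which is the intuition behind combining epitopes in one molecule.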

From the introductory text of the paper:

A variety of broadly neutralizing antibodies (bnAbs) have been isolated from HIV-1 infected individuals (1–3), but their potential to treat or prevent infection in humans may be limited by the potency or breadth of viruses neutralized (4, 5). The targets of these antibodies have been defined based on an understanding of the HIV-1 envelope structure (6–9). While bnAbs occur in selected HIV-1 infected individuals, usually after several years of infection, it remains a challenge to elicit them by vaccination because broad and potent HIV-1 neutralization often requires unusual antibody characteristics, such as long hypervariable loops, interaction with glycans, as well as a substantial level of somatic mutation. Strategies have thus shifted from active to passive immunization to both protect against infection and to target latent virus (10–14). We and others have begun to explore combinations of bnAbs that optimize potency and breadth of protection, thus reducing the likelihood of resistance and viral escape (15–17). Antibodies directed to the CD4bs, MPER, and variable region glycans are among the combinations that so far provide optimal neutralization (18). In addition, alternative combinations have also been investigated for the immunotherapy of AIDS, by directing T lymphocytes to activate latent viral gene expression and enhance lysis of virally-infected cells (19, 20). Given that multiple antibodies may help to reduce the viral replication that sustains chronic HIV-1 infection, we report here the generation of multi-specific antibodies designed to increasing the efficacy of HIV therapy.

Although individual anti-HIV-1 bnAbs can neutralize naturally occurring viral isolates with high potency, the percentage of strains inhibited by these mAbs varies (21, 22). In addition, resistant viruses can be found in the same patients from whom bnAbs were isolated, suggesting that immune pressure against a single epitope may not optimally protect or treat HIV-1 infection. We hypothesized that the breadth and potency of HIV-1 neutralization by a single antibody could be increased by combining the specificities against different epitopes into a single molecule.

Glycans are sugar signaling molecules bound to proteins. (They are very challenging molecules with which to work, although spectacular advances in their characterization are under way.) An "epitope" is the region of the antigen, typically a short sequence of amino acids, that a CDR recognizes and binds.

Some technical text relating to the design of the antibodies:

To achieve our goal, we used a previously undescribed trispecific Ab format. Three specificities were combined by using knob-in-hole heterodimerization (24) to pair a single arm derived from a normal immunoglobulin (IgG) with a double-arm generated in the CODV-Ig. A panel of bnAbs was evaluated, including those directed against the CD4bs that included VRC01 and N6, as well as PGT121, PGDM1400 and 10E8 (fig. S1). A modified version of the latter, termed 10E8v4, was used because of its greater solubility (25). We first determined which bispecific arms showed the best potency, breadth and yield. This screening analysis revealed that combinations which contained PGDM1400, CD4bs, and 10E8v4 showed the highest level of production and greatest potency of neutralization (fig. S2).

We then evaluated different combinations of single arm and double arm specificities from PGDM1400, CD4bs, and 10E8v4 Abs for their expression levels and activity against a small panel of viruses (fig. S3), leading ultimately to the identification of trispecific antibodies VRC01/PGDM1400-10E8v4 and N6/PGDM1400-10E8v4 as lead candidates. When analyzed against a panel of 208 viruses (18) and compared to the parental antibodies alone, the highest neutralization potency and breadth was observed with N6/PGDM1400-10E8v4, with only 1 of the 208 viruses showing neutralization resistance...

The molecule was able to prevent SHIV infection (SHIV being a chimeric simian-human immunodeficiency virus) in a model animal, the macaque. This said, a molecule of this design has not been tested in humans, although human volunteers tolerated a bispecific analogue quite well. It is not known whether the antibodies will generate ADAs, or "antidrug antibodies," which are antibodies against antibodies. This risk is always associated with protein drugs, despite their broad success in treating disease and saving lives.

The authors comment thusly:

While further human trials are needed to assess the full potential of the trispecific Ab platform, the data from the NHP challenge study described here, as well as the previous experience in humans with bispecific Abs (44), suggests that the approach merits further clinical investigation. Studies in HIV-infected subjects, alone or in combination with other immune interventions, will address the potential of trispecific Abs to provide durable protective immunity against infection or sustained viral control in HIV infected subjects during drug holidays or in the absence of antiretroviral therapy.

The experimental details of the project are described in the supplementary information, which is apparently open access and is here: Supplementary Information

Here one may learn that the technology making this work possible is genetic engineering.

To wit:

Trispecific antibodies were produced by transient transfection of 4 expression plasmids into Expi293 cells using ExpiFectamine™ 293 Transfection Kit (Thermo Fisher Scientific) according to manufacturer’s protocol. Briefly, 25% (w/w) of each plasmid was diluted into Opti-MEM, mixed with pre-diluted ExpiFectamine reagent for 20-30 minutes at room temperature (RT), and added into Expi293 cells (2.5×10^6 cells/ml). An optimization of transfection to determine the best ratio of plasmids was often used to produce the trispecific antibody with good yield and purity. 4-5 days after transfection, the supernatant from transfected cells was collected and filtered through 0.45 µm filter unit (Nalgene). The trispecific antibody in the supernatant was purified using a 3-step procedure…

All protein drugs are, in fact, GMOs, and if you have a politically motivated hatred of genetic engineering and all things GMO because you get your "science" from reading Greenpeace pamphlets - Greenpeace being an organization that hates science with the same intensity as, say, Republicans - these kinds of drugs are not for you.

Nevertheless, this is exciting and encouraging work.

Enjoy the weekend.

"Smart Bricks" for Measuring the State of Gasifier Walls.

It is becoming increasingly clear that all efforts to address climate change have failed, and it will fall to future generations - assuming that we have not permanently impoverished them and destroyed their futures with our bad thinking on energy and the environment, left and right - to clean up the mess we've left them.

In desultory reading on a night of reflection on my life, where I feel the guilt of my generation, the history of bad ideas in energy flows before me like a suddenly honest sinner's nightmare, but even as I recognize our crime against the future, I am forced to confess that not every bad idea is totally without merit, if at least something valuable can be extracted from it.

One of the worst ideas in energy - one that has actually seen industrial application, happily in only a limited number of places - was a pet project of Jimmy Carter, he of the bad ideas in energy. Briefly, at least while a Presidential primary candidate, though thankfully not when he actually became President, this bad idea was also endorsed, in theory at least, by Barack Obama. The bad idea to which I refer is, of course, coal gasification, technically known as "reformation," to make synthetic petroleum, also known as Fischer-Tropsch fuels. (I opposed Obama in the 2008 primaries based on this position; happily he proved me wrong - his policies were much superior to his rhetoric in this case.)

In coal gasification, the idea is to heat coal to very high temperatures under pressure in the presence of steam - actually not steam but supercritical water - to make a mixture of hydrogen and carbon oxides known colloquially as "syngas." If the heat for this process is generated by burning coal and dumping the waste indiscriminately into the air, this technology would at best double, and at worst more than triple, the climate change impact of petroleum, a climate impact that is already entirely unacceptable.
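The basic reaction is strongly endothermic, which is why the source of the process heat matters so much. A quick check from standard enthalpies of formation (gas-phase water, textbook 298 K values):

```python
# C(s) + H2O(g) -> CO(g) + H2(g): the core gasification reaction.
# Standard enthalpies of formation at 298 K in kJ/mol (textbook values).
dHf = {"C(s)": 0.0, "H2O(g)": -241.8, "CO(g)": -110.5, "H2(g)": 0.0}

dH_rxn = (dHf["CO(g)"] + dHf["H2(g)"]) - (dHf["C(s)"] + dHf["H2O(g)"])
print(f"ΔH ≈ {dH_rxn:+.1f} kJ/mol")  # positive: heat must be supplied
```

Every mole of carbon gasified this way demands about 131 kJ of heat input, which is exactly the heat a coal-fired scheme supplies by burning still more coal.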

Because we are not smart enough, or honest enough, even to stop using fossil fuels - choosing instead to address them with worthless pablum about a grand "renewable energy" future that did not come, is not here, and will not come - the engineering challenge for future generations will dwarf ours, since they will need not only to ban fossil fuels but also to remove hundreds of billions of tons of dangerous fossil fuel waste, carbon dioxide, from the atmosphere, where it has been accumulating at a rate of over 30 billion tons per year, a rate which is rising, not falling, mostly because of the runaway popularity of the dangerous fossil fuel natural gas, for which so called "renewable energy" is nothing more than a fig leaf. (Dangerous natural gas is not clean; it is not safe; and in spite of tiresome and obviously untrue nonsense put out by purveyors of the grotesquely failed and ridiculously expensive so called "renewable energy" scam, it is not "transitional.")
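The ppm figures and the billions-of-tons figures connect through one commonly used conversion, roughly 7.8 billion tons of CO2 per ppm of atmospheric concentration (the factor is my addition, not from the text above):

```python
# Convert the observed atmospheric rise (ppm/yr) into mass terms.
# ~7.8 Gt CO2 per ppm is a commonly used conversion factor.
GT_CO2_PER_PPM = 7.8

rise_ppm_per_yr = 2.3  # recent Mauna Loa growth rate
airborne_gt_per_yr = rise_ppm_per_yr * GT_CO2_PER_PPM
print(f"atmospheric accumulation ≈ {airborne_gt_per_yr:.0f} Gt CO2/yr")
```

Against 30-plus billion tons emitted per year, roughly 18 billion tons accumulating in the air implies the oceans and biosphere are absorbing the balance, with consequences of their own.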

Sophisticated arguments have been made that the thermodynamic and thus the related economic engineering challenges of removing carbon dioxide from the atmosphere make it next to impossible. A widely discussed paper on this topic is here:

Economic and energetic analysis of capturing CO2 from ambient air (House et al., PNAS 108 (51) 20428–20433 (2011))

A number of arguments questioning this assumption have been advanced, and, in the very same journal where the House paper was published, a team of scientists at Columbia has argued in an overview paper that House's paper better not be the last word, because removing the dangerous fossil fuel waste carbon dioxide is an urgent matter: The urgency of the development of CO2 capture from ambient air (Lackner et al., PNAS 109 13156–13162 (2012)). Of course, that paper was published, as of this writing, almost 5 years ago, so whatever "urgency" there is about climate change, it's been totally ignored - which is not to say it's not urgent, only that it's becoming more urgent.

For a nice review of chemical air capture strategies, see: Direct Capture of CO2 from Ambient Air (Jones et al., Chem. Rev., 2016, 116 (19), pp 11840–11876). (I've attended lectures by the senior author of this review, Chris Jones of Georgia Tech, at scientific meetings; I'm impressed by his work.)

From my perspective, air capture should be an achievable goal for human beings in a generation less stupid and selfish than ours. I say this because plants clearly do it (albeit surprisingly inefficiently, in energy-to-mass terms), and therefore, just as generations living before the Wright brothers could recognize that heavier-than-air flight had to be possible because birds and insects existed, we can recognize that air capture must be possible. House's paper explicitly states that biological strategies for removing carbon dioxide are not covered.

There is, of course, an option that simultaneously exploits both biological and physicochemical routes for removing carbon dioxide from the atmosphere, and utilizes the chemistry I evoked at the outset of this post: reformation, not of coal, but of biomass. The combustion of biomass has, of course, been practiced for millennia, and it is still practiced widely today; but as practiced it is very dangerous - dangerous biomass combustion waste is responsible for about half of the seven million air pollution deaths that take place each year as of 2017 - even as awful, poorly educated dullards carry on about so called "nuclear waste," which has killed no one in more than half a century of accumulation, and which is in actuality a valuable resource that future generations may appreciate more than most people in this easily distracted generation are competent to understand.

No matter.

My hostility to so called "renewable energy" should be familiar to anyone familiar with my writings here and elsewhere, and biomass is often defined as "renewable energy," but, this said, I believe, as I do for so called "nuclear waste," that biomass waste has great potential as a resource, most notably for the removal of carbon dioxide from the atmosphere, but also for the recovery of other critical materials, the most important of which is phosphorus. (World supplies of mineable phosphorus - on which the world's food supply currently depends - are very much subject to depletion.)

While “renewable” biofuels like ethanol have represented a tremendous environmental tragedy in the United States, (you know, the road to hell…) resulting in the destruction of the Mississippi Delta ecosystem, for example, it happens that there is another approach to biomass utilization that is likely to prove far more benign than fermentation and distillation, and to the extent it is one of the few options capable of actually removing carbon dioxide from the air, deserves consideration. This is the thermal reformation of biomass, where biomass substitutes for the coal based scheme that Jimmy Carter proposed, and which frankly, we should all be grateful, never made it to big time in the United States, the world’s most egregious consumer nation.

If the heat driving this largely endothermic reaction is nuclear heat, the process is almost certain to be unambiguously carbon negative, particularly where the carbon collected is utilized in products like polymers, carbon fibers, carbon nanotubes, refractory metal carbides (which would be necessary for delivering nuclear heat at temperatures high enough to drive reformation and thermochemical water splitting reactions), silicon carbide, and the extremely useful and exciting graphene, modified graphenes, and carbon nitrides. All of these products sequester carbon, and do so in an economically viable, "waste to products" way.

But there’s a problem. Biomass is not pure carbon, hydrogen, oxygen and nitrogen, of course: It also contains a considerable fraction of metals. The most problematic of these are the alkali metals, in particular potassium and sodium, and to a far lesser extent, lithium and rubidium.

Consider potassium.

A nice paper, the residue of Chinese grammar in the translation aside, recently released as a corrected proof, discusses the case quite well: Transformation and release of potassium during fixed-bed pyrolysis of biomass (Lei Deng, Jiaming Ye, Xi Jin, Defu Che, Journal of the Energy Institute, corrected proof, accessed 9/19/17)

An excerpt:

In China, a few biomass-fired boilers have been successively built and operated during last ten years [8], and grate firing is still the most widely used firing method. However, the grate-fired boiler has been experiencing serious problems of fouling, slagging and high-temperature corrosion according to the foreign experiences [6,9–12]. In China, although the operational time of the grate-fired boiler was relatively short, the tube bursting of superheaters began to occur due to the deposit-induced high-temperature corrosion [13–16]. These problems appeared in biomass-fired boilers have been considered to originate from the release of K, Cl and S during combustion of biomass.

The occurrences of ash deposition and high-temperature corrosion on superheaters have experienced three processes. First, parts of K, Cl and S go into the gas phase to form HCl, Cl2, SO2, SO3, KOH, KCl or K2SO4 during combustion of biomass [7,17–19]. Second, gaseous potassium salts condense in the gas phase and on the surface of superheater to form sticky particles and condensed layer, respectively. Then the ash deposition occurs when fly ash particles are trapped by the sticky condensed layer [9–11,16]. Finally, the ash deposit (mainly composed of KCl and K2SO4) and metallic matrix react with HCl, Cl2, SO2 or SO3, which would cause the growth of ash deposit and high-temperature corrosion [9,12,20–23]. Obviously, potassium is involved in all three processes and plays a crucial role. Compared with coal, biomass generally has much higher potassium content [5,24]. Although pyrolysis is different from combustion with regard to the surrounding environment and temperature fields, it is still meaningful to investigate the transformation and release of potassium during biomass pyrolysis, because pyrolysis happens at the primary stage of combustion. The investigation will be significantly helpful to understand the origin of ash deposition and high-temperature corrosion occurred on superheaters and find methods to solve these problems. It can also be useful to the design of biomass-fired boilers or other thermal conversion equipment.

A form of energy technology which requires constant replacement of infrastructure is neither sustainable nor environmentally benign, simply from a materials utilization standpoint, since the preparation of materials is generally energy intensive. (This is a big problem with another example of the failed, expensive, so called "renewable energy" industry, the wind industry, where the Danish database of turbines shows that the piece of crap turbines last, on average, less than 16 years before needing replacement.)

I personally believe that the materials science issues involved in high pressure reformers can be solved; however, the question stands unequivocally before us: we are out of time, and anything we may or may not do to address climate change is already late. It may therefore be desirable to build less than optimized biomass reformers, at least as a stopgap measure, until engineers and scientists can optimize materials to be more sustainable. We must have technologies that not only prevent the dumping of dangerous fossil fuel waste into our favorite waste dump, the planetary atmosphere, but also remediate the waste dump itself: our atmosphere is a "superfund" site, and we must find a way to clean it.

It is therefore with interest that I read a recently published paper that purports to have developed a technology that can at least measure the performance of materials in high temperature reformers continuously, during operation.

The paper is here: Estimations of Gasifier Wall Temperature and Extent of Slag Penetration Using a Refractory Brick with Embedded Sensors (Debangsu Bhattacharyya et al., Ind. Eng. Chem. Res., 2017, 56 (35), pp 9858–9867)

Some text from the paper:

Integrated gasification combined cycle (IGCC) technology is a promising technology for producing electricity from fossil fuels and biomass with high efficiency. This technology offers superior environmental performance and high energy efficiency. 1,2 The gasifier is the heart of IGCC plant.3−5 The operating temperature of a gasifier is one of the key variables.6,7 Lower temperature can lead to lower carbon conversion and increase the viscosity of slag, eventually leading to disruption in the slag flow.8 Higher operating temperature can improve carbon conversion but reduces the life of the refractory.9,10 The lifetime of the refractory lining of the gasifier is a major concern for both cost11 and availability. If the gasifier is operated at the optimum temperature, the lifetime of the high chromia refractory can be prolonged to almost 2 years.10,12 Therefore, it is desired that the operating temperature of a gasifier is strictly monitored and controlled.

"Almost two years..." That's even worse than wind turbines, and wind turbines suck. Moreover, the liner they're describing is chromia. Chromium is not an environmentally benign element with which to work. (To be fair, there are many other refractory oxides, carbides and nitrides with which one can envision accomplishing the same task; one of the most important of these is zirconia, ZrO2, and of course there are nitrides, like, say, thorium nitride.)

The authors go on in the paper to describe a type of brick with an embedded sensor. The brick is alumina, and embedded in it is a thermistor made of tungsten carbide in an alumina matrix.

The sensor is arranged so as to give an interdigital capacitor, a set of capacitors in series that measures changes in temperature via changes in the dielectric constant of the system and via thermal expansion, which changes the distance between the capacitor electrodes; changes in the dielectric constant (presumably from a baseline) also register the intrusion of slag elements.
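The sensing principle can be caricatured in a few lines of code. This is only a toy model with invented coefficients, not the authors' rigorous first-principles model: the dielectric constant and the electrode gap both drift with temperature, so a measured capacitance can be inverted numerically for temperature.

```python
# Toy model of the capacitive sensing principle: invented coefficients,
# NOT the authors' model. Both the dielectric constant and the electrode
# gap drift with temperature, so capacitance can be inverted for T.

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def capacitance(temp_c, eps_r20=9.8, d_eps_dt=2e-3,
                area=1e-4, gap20=1e-3, alpha=8e-6):
    """Capacitance (F) of one idealized parallel-plate element at temp_c (C)."""
    eps_r = eps_r20 + d_eps_dt * (temp_c - 20.0)   # dielectric rises with T
    gap = gap20 * (1.0 + alpha * (temp_c - 20.0))  # gap grows by expansion
    return EPS0 * eps_r * area / gap

def estimate_temperature(c_meas, lo=20.0, hi=1600.0, tol=1e-6):
    """Invert capacitance(T) by bisection; C(T) is monotonic increasing here."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if capacitance(mid) < c_meas:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

c_hot = capacitance(1350.0)                 # pretend this was measured
print(round(estimate_temperature(c_hot)))   # recovers 1350
```

The paper's actual estimation machinery (Kalman filtering against a dynamic thermal model) is far richer than this; the sketch only shows why a capacitance reading carries temperature information at all.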

Some remarks from the conclusion:

In this paper, a rigorous, first-principles, dynamic model of the smart refractory brick has been developed. The thermal model for multilayer gasifier wall has been developed by considering properties of the pristine smart refractory brick as well as that of the slag-infiltrated brick. Models of the interdigitated capacitor as well as the thermistor have been developed by considering the installation direction and the geometries of the embedded sensors. Using the TKF, both the thermistor and IDC sensors are found to provide satisfactory estimates of the temperature profile for pristine and slag-infiltrated bricks despite high model mismatch. Our results show that satisfactory estimation of temperature profile can be obtained even for locations where there is no sensor by utilizing measurements from sensors placed elsewhere. This suggests that an optimal sensor placement would be very valuable for these smart bricks. The TKF is found to result in poor estimation of the slag penetration length. However, the EKF yields superior estimates even though the rate of change of the capacitance becomes higher when the slag reaches the sensor...

...For commercial application of the smart refractory brick in industrial gasifiers, many aspects need to be investigated. First, the brick needs to be tested under actual operating conditions for prolonged time. Second, impacts of the startup/shutdown and off-design operating conditions on the brick stability need to be evaluated. Third, response of the embedded sensors may be affected by unknown inputs. Fourth, because a wireless transmission system is being considered, there may be issues due to communication constraints, packet dropouts, and synchronization errors. The authors look forward to investigating some of these aspects in the near future.

One wishes the authors luck in a country, this one, where the three branches of government are controlled by people who hate science because they're too stupid to know any.

I appreciate the work of Dr. Debangsu Bhattacharyya, as well as his courage to bear his very cool name right in the heart of Trump country, West Virginia.

Interesting work.

Enjoy the rest of the work week.

A Polly Arnold Review on the Organometallic Chemistry of Neptunium.

There's this great video by Norah Jones during a tribute to Gram Parsons where, in preface to performing his song "She," she declares that on listening to every song performed at the tribute she said, "Oh that's my favorite song..." and then declares that "But this is really my favorite song..." (It's a wonderful performance.)

My kid, who I am happy to report is visiting me this weekend - coming home from college to celebrate my birthday - always laughs at me because every time I see a poster on a wall in a university building referring to the chemistry of an element - any element - I say "That's my favorite element..."

"Dad," he says, "Every element is your favorite element." (That's not true. I don't care all that much about the chemistry of terbium, or for that matter lutetium.)

One of my favorite elements, really, is the element neptunium, since I regard it as a key to decreasing the risk of nuclear war as close to zero as is possible, via the "Kessler Solution." (We cannot uninvent nuclear weapons, nor can we ever eliminate the risk of nuclear war, since the supply of uranium is inexhaustible. I explored this point elsewhere: On Plutonium, Nuclear War, and Nuclear Peace.)

In another post on the same website, I wrote about some interesting chemistry associated with the actinide elements, noting that Fritz Haber - the Nobel Laureate who in many ways had the greatest effect on day to day life of any Nobel Laureate, since, despite its greatly problematic environmental consequences, the world's food supply depends on the Haber process - noted very early on that uranium was likely to be a wonderful catalyst for nitrogen fixation: Uranium Catalysts for the Reduction and/or Chemical Coupling of Carbon Dioxide, Carbon Monoxide, and Nitrogen. In that post, I discussed the work of Polly Arnold, a world leader in organoactinide chemistry.

Now Dr. Arnold has written a review article in one of my favorite journals, Chemical Reviews:

Organometallic Neptunium Chemistry (Chem. Rev., 2017, 117 (17), pp 11460–11475)

An excerpt from the text:

The only neptunium (93Np) available on Earth is man-made. The element has 24 radioisotopes; (14) the most stable are 237Np [t1/2 = 2.144(7) × 106 y], 236Np [t1/2 = 1.54(6) × 105 y], and 235Np [t1/2 = 396.1(12) d], while all the remaining have half-lives of under 4.5 days, with a majority below 50 min.(15) The longest-lived nuclide, 237Np, has a half-life 2117(24) times shorter than the age of the Earth, thus no primordial neptunium is present today. This notwithstanding, accurate α-ray measurements of concentrated uranium ores have allowed the direct detection of natural 237Np at the maximum mass ratio to 238U of 1.8 × 10–12, as a result of the neutron activation and decay products.(16) The isotope 237Np is typically produced from the β decay of 237U [t1/2 = 6.749(16) d], being about 0.03% of the total material in spent commercial uranium fuel rods and about 5% of that of plutonium. Around 50 000 kg of the element is produced annually, essentially pure from spent fuel via the PUREX (plutonium uranium redox extraction) waste separation process.(17) The long half-life of 237Np makes it a major contributor to the total radiation dose remaining after spent civil nuclear waste has been stored for tens of thousands to millions of years. Additionally, if plutonium has not been recovered prior to disposal of waste, then additional 237Np will be formed from 241Pu via α-decay, prolonging the long-term radiotoxicity of the waste.
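The "2117(24) times shorter than the age of the Earth" figure in that excerpt is easy to check, and it explains why no primordial neptunium survives: after roughly two thousand half-lives, about 2^-2117 of any original stock remains, which is zero for all practical purposes. A quick sanity check, using the commonly quoted 4.54-billion-year age of the Earth:

```python
# Sanity check on the excerpt's "2117(24) half-lives" figure.
age_earth_y = 4.54e9         # commonly quoted age of the Earth, years
np237_half_life_y = 2.144e6  # Np-237 half-life from the excerpt, years
ratio = age_earth_y / np237_half_life_y
print(f"~{ratio:.0f} half-lives")  # ~2118, consistent with the quoted 2117(24)
```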

The density of neptunium is, by the way, 19.5 tons per cubic meter, meaning that the roughly 50 tons of neptunium produced each year would fit into a cube about 137 centimeters on a side. (This compares favorably with dangerous fossil fuel waste, which, at 30 billion tons per year, can never be contained under any circumstances.) However, no one could ever construct such a cube, since even 19.5 tons of neptunium greatly exceeds its critical mass, and this being true, the element is a very wonderful potential nuclear fuel.
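The cube arithmetic is a one-liner (back-of-envelope only):

```python
# Back-of-envelope check on the neptunium-cube claim above.
mass_t = 50.0          # annual Np production, tons (50,000 kg, per the review)
density_t_m3 = 19.5    # density of neptunium, tons per cubic meter
volume_m3 = mass_t / density_t_m3
side_cm = 100.0 * volume_m3 ** (1.0 / 3.0)
print(f"{volume_m3:.2f} m^3, cube side {side_cm:.0f} cm")  # ~2.56 m^3, ~137 cm
```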

Fifty tons of neptunium is a very valuable resource, especially because of the very interesting property metallic neptunium has of forming a low-melting eutectic with metallic plutonium, which makes for interesting possibilities for the LAMPRE-type reactors that were explored, by a generation smarter than ours, in the mid twentieth century.

One hopes that a future generation, smarter than ours, will utilize this neptunium resource to clean up the intractable mess with which our irresponsibility and dumb assed ideas and fantasies have left them.

They may not, and should not, I think, forgive us.

Chemical Reviews, as an aside, is a wonderful place for chemists to catch up in areas in which they are non-specialists. I love that journal.

Enjoy the rest of the weekend.

Oh oh. A plutonium powered satellite hit the atmosphere and the SNAP device vaporized.

As the beautiful Cassini mission comes to an end, I am reminded that the television physicist with the cool haircut, Michio Kaku, opposed the mission, since he was concerned that it would crash into the Earth and that its plutonium RTG - the same kind of device that, ironically, went to, um, Pluto (for which the element is named) - would wipe out all life on Earth.

And Michio Kaku would know, since he's a famous physicist who appears on TV all the time.

And, now, I have the unpleasant duty to inform you that the very event he feared has happened, and has been reported in a major scientific journal; be scared; very scared.

Here's the report: Atmospheric Burnup of a Plutonium-238 Generator (P. W. Krey, Science, 1967, 158 (3802), pp 769–771)

Excerpts from the text:

On 21 April 1964, a navigational satellite employing a SNAP-9A generator (Systems for Nuclear Auxiliary Power) did not reach orbital velocity because of a rocket failure after launch. The SNAP-9A generator is a nuclear fueled power package which converts the heat developed by a radioactive source into electrical energy, contains about 17 kilocuries of 238Pu and weighs 12.3 kg (1). Since 238Pu is a highly toxic nuclide, and since bone is the critical organ for soluble plutonium and lung for insoluble plutonium, considerable interest was exhibited in the ultimate fate and disposition of the 238Pu. Korsmayer (2) estimated that the satellite entered the atmosphere at about 150,000 feet (46 km or 46,000 m) over the Indian Ocean in the Southern Hemisphere. There are three alternatives as to what could have happened when the SNAP-9A reentered the atmosphere. One is that it plunged intact into the Indian Ocean leaving little or no remnants in the atmosphere. A second is that the heat of reentry into the atmosphere completely consumed the device and the pyrophoric 238Pu ablated into small particles.

Now the bad news:

A synoptic distribution of the debris on a global scale for the period of January to March 1966 is shown in Fig. 2. One can see from this distribution that little SNAP-9A has passed from the stratosphere into the troposphere by this time. The low concentrations in the equatorial stratosphere, increase in altitude of the concentration contours at the equator, uniform concentrations in the Northern polar stratosphere, and the bulk of the SNAP-9A debris in the middle to upper latitudes of the Southern Hemisphere are clearly discernible. This distribution is in accord with Machta's model of stratospheric winter circulation within each hemisphere (4); Machta describes a rising air column in the equatorial regions, a poleward flow at about 110,000 feet, and a downward flux at middle and upper latitudes…

… By integrating the contours in Fig. 2 (6), a total stratospheric inventory of 15 kCi of SNAP-9A 238Pu or 88 percent of the 17 kCi in the original generator can be accounted for. Of this, 80 percent resides in the Southern Hemisphere stratosphere, while only 20 percent was transported into the Northern Hemisphere. Surface air concentrations and deposition values of SNAP-9A 238Pu in the Northern and Southern Hemispheres will ultimately reflect this 4 to 1 proportion. Based upon this inventory of SNAP-9A 238Pu, we conclude that the generator completely burned up during reentry and ablated into small particles.

I conclude we're all going to die.

As for Michio Kaku, I have had occasion to watch him on TV. He has a very cool haircut and it makes his kind of slightly supercilious lectures a little bit more tolerable. I can't say I've watched a lot of his shows, but he certainly does seem to know something about stars and stardom.

But again, I haven't watched him too much. I'm not all that much into television physicists.

I will say this. Scrolling ten minutes through Cassini pictures is, for me at least, worth a lifetime of Michio Kaku television appearances.

Just this morning I was reminded of reading, a few years back, Jared Diamond's fabulous book, COLLAPSE: HOW SOCIETIES CHOOSE TO FAIL OR SUCCEED.

In it he tells the story of the demise of the Greenland Norse, who, he concludes, died out because, unlike the Inuit - who survived quite well in exactly the same region, even further north than the Norse - the Greenland Norse had some kind of cultural prohibition, a CULT prohibition perhaps, against eating salmon.

This he concludes from the presence or absence of salmon bones in archaeological sites related to the two cultures. The Norse, he claims, would only eat grain and grain-fed domestic animals, and died out when the temporary warm spell that brought them to Greenland ended. The Inuit lived, eating salmon, just as they had done for thousands of years.

This is actually relevant to the issue of plutonium burning up in the atmosphere in 1964. (Actually, metric ton quantities of it were vaporized in nuclear testing before the SNAP-9A plutonium RTG vaporized, but that was generally the 239 isotope, and not the more radioactive 238 isotope that powers spacecraft.)

Our atmosphere is collapsing, more rapidly than ever before. We have a cult of so called "renewable energy" to address it, and we've spent trillions of dollars on this cult in just the last ten years, with the result that the rate of the collapse of the atmosphere is increasing, not decreasing.

It has been shown that in the last half a century, the fuel that displaced the most dangerous fossil fuels was nuclear fuel, including a healthy amount of plutonium. There are more than sixty billion tons of carbon dioxide that didn't get dumped into the atmosphere because of plutonium and uranium.

We, however, fear plutonium just as the Norse apparently feared salmon.

Diamond's right; societies choose to fail. Apparently, in the 21st century, it's all one society, spread across the planet.

And we are failing. In the next two weeks or so, we will reach the annual minimum for the sinusoidal seasonal variation in carbon dioxide atmospheric concentrations. It will come in at above 403 ppm. Ten years ago, the time at which our two trillion dollar expenditure on wildly popular so called "renewable energy" started, it was 381 ppm.
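Taking the two endpoint figures above at face value, the implied decadal growth rate is a one-liner:

```python
# Implied CO2 growth rate from the two endpoint figures quoted above.
ppm_then, ppm_now, years = 381.0, 403.0, 10
rate = (ppm_now - ppm_then) / years
print(f"~{rate:.1f} ppm per year")  # ~2.2 ppm per year
```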

We'd all like to believe that all the responsibility for climate change resides on the right. But just as Lincoln blamed slavery on both the South and the North in his Second Inaugural Address, we on the left have our own guilt in the issue of climate change.

I personally think we should eat the salmon, but I predict we won't.

I'm sorry for this post, but I stumbled across this old paper while researching modern thermoelectric materials, a fascinating subject, and I just couldn't resist this note.

Have a wonderful Friday tomorrow and a wonderful weekend.

North Dakota.

My Favorite Things

Computational Screening of 670,000 Materials For Optimal Separation of Krypton From Xenon.

As I poke around this long weekend through scientific papers that I was inspired to collect but never actually read or filed properly, I came across a really cool one, this one: What Are the Best Materials To Separate a Xenon/Krypton Mixture?

(Cory M. Simon, Rocio Mercado, Sondre K. Schnell, Berend Smit, and Maciej Haranczyk, Chem. Mater., 2015, 27 (12), pp 4459–4475)

Xenon and krypton are very rare gases in the Earth's atmosphere, from which, because they are very useful, they are obtained industrially at considerable expense. (The most common use for xenon is in automotive headlights, because its excited states decay to produce light that is very similar to daylight.)

The authors of the cited paper do quite a nice job in describing some of the background on these two gases, and I'll quote what they say in their introduction:

The noble gases xenon (Xe) and krypton (Kr) have several important applications.28 Xenon is used as an anesthetic 29−31 and for imaging 32 in the health industry and as a satellite propellant in the space industry.33 Both xenon and krypton are used in lighting,34 in lasers,35,36 in double glazing for insulation,37,38 and as carrier gases in analytical chemistry.39 Since krypton and xenon are present in Earth’s atmosphere at concentrations of 1.14 and 0.087 ppm, respectively,40 the conventional method to obtain xenon and krypton is as a byproduct of the separation of air into oxygen and nitrogen by cryogenic distillation.41 This byproduct stream from air separation consists of 80% krypton and 20% xenon.42 At Air Liquide, this mixture is compressed to 200 bar and stored in cylinders, then sent to a separate Xe−Kr separation plant to undergo another cryogenic distillation to obtain pure xenon and pure krypton.43 Cryogenic distillation for the separation of krypton and xenon has a very high energy and capital requirement, reflected by the cost of high-purity xenon, about $5000/kg.44

Both gases are considered in general to be inert, and until 1962 they were thought not to exhibit any kind of chemistry at all. Since that time, it has been discovered that both gases can, in fact, react to form compounds, krypton only at very low temperatures. The chemistry of xenon, by contrast, is quite extensive. (I touched briefly on the discovery of xenon chemistry here: Neil Bartlett's superpowerful oxidants NiF6- and AgF4- and the preparation of RhF6.)

As noted by the authors, the separation of these two gases is energetically and economically expensive, and the purpose of their paper is to examine, by computer modeling, approaches to designing materials that can reduce the costs of their separation by a technique known as "pressure swing adsorption," PSA, which relies on a process in which a gas consisting of multiple components is pressurized in a chamber in the presence of a solid material that has a capability to adsorb one component in preference to another. (Home oxygen generators for medical use, and nitrogen generators in some scientific laboratories, utilize a PSA approach to separate these bulk gases.)
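The physics of why an adsorbent enriches one gas over the other can be sketched with a competitive Langmuir model. The affinity constants below are invented for illustration; the paper's actual screening uses grand-canonical Monte Carlo simulation, not this toy model:

```python
# Why pressure-swing adsorption enriches xenon: a competitive Langmuir
# model with invented affinity constants (the paper itself uses
# grand-canonical Monte Carlo, not this toy).

def langmuir_loading(p_xe, p_kr, b_xe=2.0, b_kr=0.4, q_sat=5.0):
    """Loadings (mmol/g) of Xe and Kr competing for the same sites.

    b_xe > b_kr encodes a stronger affinity for the larger, more
    polarizable xenon atom; all numbers here are hypothetical.
    """
    denom = 1.0 + b_xe * p_xe + b_kr * p_kr
    return q_sat * b_xe * p_xe / denom, q_sat * b_kr * p_kr / denom

# 20/80 Xe/Kr feed at a total pressure of 1 bar, as in the industrial
# byproduct stream described in the excerpt above
q_xe, q_kr = langmuir_loading(0.2, 0.8)
selectivity = (q_xe / q_kr) / (0.2 / 0.8)
print(f"adsorbed-phase Xe/Kr selectivity ~ {selectivity:.1f}")  # ~5.0
```

For a shared-site Langmuir model the selectivity collapses exactly to the ratio of the affinity constants; real nanoporous materials deviate from such ideal behavior, which is precisely what the computational screening hunts for.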

It is possible, I suppose, to exploit the chemical differences between the gases to effect their separation; however, in most cases the chemistry involves the use of highly reactive, corrosive and toxic fluorine gas. In the presence of water, xenon fluorides can hydrolyze to form xenon oxides, which can be highly explosive.

By contrast, pressure swing adsorption is orders of magnitude safer and most probably considerably cheaper.

Modern computers are extremely powerful compared with computers from even a short time ago, but computational power is not necessarily cheap or free when one considers very, very, very complex calculations.

Nevertheless in silico calculations can save far greater expense in screening for molecular structures - be they involved in medicinal chemistry or in materials chemistry, such as being explored here - that accomplish these kinds of tasks.

Candidate materials for the separation of these gases exist in quite an array of differing types, again, I'll let the authors describe them:

By combining different molecular building blocks in their synthesis, advanced classes of nanoporous materials are highly tunable. For example, in metal organic frameworks (MOFs),6 metal nodes or clusters form a coordination network with organic linkers. Other highly adjustable materials include covalent organic frameworks (COFs),7 zeolitic imidazolate frameworks (ZIFs),8 and porous polymer networks (PPNs).9 High chemical tunability not only enables one to tailor-make a material for each application under a variety of conditions, but also inundates researchers with practically endless possibilities. Because of limited resources and time in practice, only a small subset of the possible materials can be synthesized and tested for each application.

The separations, as the authors note, in a pressure swing situation rely mostly on the size difference between the two types of atoms: the mean diameter of a xenon atom is 198.5 picometers, of krypton 183 picometers, a small but significant difference. The idea is to structure the pores in materials so that xenon cannot fit into the pores while krypton can, or conversely so that krypton can easily diffuse out of the pores while xenon can do so only slowly. Besides size, differences in their electronic structure - which account for the differences in their chemistry - can also be exploited without actual chemical reactions taking place.

Their computational approach - considerably streamlined relative to the "brute force" algorithms utilized previously for one class of possible adsorbents, metal organic frameworks (MOFs) - is described in the following text:

The workflow of our method to screen about 670,000 structures in the Nanoporous Materials Genome is illustrated in Figure 2 and consists of the following six steps. (1) Characterization: we compute the feature vector of each material, whose components are seven quickly computed structural descriptors. (2) Selection of the training set: we use a diversity selection algorithm76 to ensure that our training set of materials adequately covers our seven-dimensional feature space. (3) Label the training set: we perform binary grand-canonical Monte Carlo simulations to compute the Xe/Kr uptake in the training set. (4) Training of the forest: we use these training examples to grow a forest of decision tree regressors to predict Xe/Kr separation performance from the structural descriptors. (5) Prescreening: we run the structural descriptors of the remaining materials through the trained forest of decision tree regressors. (6) Materials discovery: if the forest predicts the material to be promising for Xe/Kr separations, we run grand canonical Monte Carlo simulations to refine the prediction.
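The shape of that six-step pipeline - cheap descriptors, an expensive labeler run on a small training set, a learned surrogate for prescreening, then refinement of only the promising hits - can be caricatured in a few dozen lines. Everything here is invented: the "simulation" is a stand-in formula and the surrogate is a 1-nearest-neighbor regressor rather than their forest of decision trees; only the screen-then-refine shape of the pipeline is faithful.

```python
# Toy screen-then-refine pipeline in the spirit of the quoted workflow.
# The expensive "GCMC simulation" is a stand-in formula, and the forest
# of decision trees is replaced by a 1-nearest-neighbor surrogate.
import random

random.seed(0)

def expensive_simulation(descriptor):
    """Stand-in for a grand-canonical Monte Carlo run (invented formula)."""
    pore, density = descriptor
    return max(0.0, 10.0 - abs(pore - 4.0) * 3.0) * density

# 670 toy "materials" with two cheap descriptors each (not 670,000!)
materials = [(random.uniform(2, 8), random.uniform(0.5, 2.0))
             for _ in range(670)]

# Steps 2-3: label a small training set with the expensive simulation
training = materials[:40]
labels = [expensive_simulation(d) for d in training]

def surrogate(descriptor):
    """1-NN surrogate: predict the label of the closest training point."""
    best = min(range(len(training)),
               key=lambda i: sum((a - b) ** 2
                                 for a, b in zip(training[i], descriptor)))
    return labels[best]

# Steps 5-6: cheap prescreen, then refine only the promising candidates
promising = [d for d in materials[40:] if surrogate(d) > 8.0]
refined = [(d, expensive_simulation(d)) for d in promising]
print(f"ran the expensive step on {len(training) + len(refined)} "
      f"of {len(materials)} materials")
```

The payoff is the same as in the paper: the expensive labeler runs on only a fraction of the database, with the surrogate absorbing the rest.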

The authors in this screening process describe two known materials which may be useful in these separations:

Many materials in our database are predicted to have better Xe/Kr separation performance than CC3,104 a leading material for Xe/Kr separations.44 Our models predict that the two most selective materials in the Nanoporous Materials Genome are JAVTAC, an aluminophosphate zeolite analogue,106 and KAXQIL, a calcium coordination network.105 Both materials have been synthesized but not yet tested for Xe/Kr separations. We hope that our open database of simulated Xe uptake and Xe/Kr selectivities (http://nanoporousmaterials.org/xekrseparations/) will inspire the synthesis and characterization of a new material for Xe/Kr separations.

This paper has been cited extensively since its publication two years ago, and it might be fun, if I find the time once my favorite academic libraries reopen after the holidays, to look into these citations to see if these predictions have been experimentally confirmed.

It is interesting to note that isotopic ratios of these gases tell us a lot about the history of this planet, owing to the fact that actinide elements, for example, long lived plutonium-244 - which is known to have been a constituent of the early Earth (because of xenon isotopes) - spontaneously fission in a characteristic way that causes a traceable signature of geological history.

(See, for example: Xenon isotope constraints on the thermal evolution of the early Earth (Nicolas Coltice, Bernard Marty, Reika Yokochi, Chemical Geology, 2009, 266, pp 4–9))

As the authors of the original paper note, these separations would also be valuable in the processing of used nuclear fuels, because both elements are fission products. No radioactive isotopes of xenon are very long lived in nuclear fuels; they rapidly decay into other isotopes. Xenon-135 has the highest neutron capture cross section of any nuclide known; it is rapidly converted into non-radioactive xenon-136 in the neutron flux in the core of nuclear reactors. Because of this, it does not accumulate, and in any case its half-life is on the order of hours, not days. (The presence of xenon-135 in reactor cores, where it is the cause of an effect known as "xenon poisoning," played a role in the very stupid decisions made by the operators of the Chernobyl reactor that exploded: their decision, made late at night, to remove the control rods from the reactor was intended to overcome xenon poisoning effects.)
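A back-of-envelope comparison shows why xenon-135 cannot accumulate in an operating core: at a typical thermal neutron flux (the round number below is an assumption), neutron capture removes it roughly an order of magnitude faster than its own nine-hour radioactive decay does.

```python
# Back-of-envelope: Xe-135 removal by neutron capture vs. its own decay.
# The flux is an assumed round number for a power reactor at full power.
import math

HALF_LIFE_S = 9.14 * 3600    # Xe-135 half-life, ~9.14 hours
SIGMA_CM2 = 2.65e6 * 1e-24   # capture cross section, ~2.65 million barns
FLUX = 1e14                  # assumed thermal neutron flux, n/cm^2/s

lam_decay = math.log(2) / HALF_LIFE_S  # radioactive decay constant, 1/s
lam_burnup = SIGMA_CM2 * FLUX          # capture ("burnout") rate, 1/s

print(f"decay:   {lam_decay:.2e} /s")
print(f"capture: {lam_burnup:.2e} /s "
      f"(~{lam_burnup / lam_decay:.0f}x faster)")
```

At shutdown the capture term vanishes while decay of iodine-135 keeps feeding xenon-135, which is exactly the transient the Chernobyl operators were fighting.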

Fission gases contained in fuel rods are therefore highly enriched in these valuable gases, and in theory they could be collected from used nuclear fuel for use, especially xenon.

Krypton contains one relatively long-lived radioactive isotope, krypton-85, and its detection has been utilized to identify nuclear explosions (both atmospheric and underground) as well as reprocessing of nuclear fuels around the world, since it is generally vented to the atmosphere rather than recovered for use. (Krypton-85, I think, is possibly a very useful material for providing a continuous portable light source for remote locations, or as a continuous power source.) Venting it to the atmosphere is probably not a very dangerous practice, however - certainly not at the level of venting dangerous fossil fuel waste to the atmosphere, since dangerous fossil fuel waste and dangerous biomass combustion waste combine to cause 7 million deaths every year.

Krypton-85 can also be stored to provide a source of non-radioactive rubidium-85, which would be less radioactive than natural rubidium, since the latter contains the naturally occurring long-lived radioactive isotope rubidium-87. However, except for esoteric research purposes, I cannot imagine that there is much call for isotopically pure rubidium-85, but hey, you never know.

Some folks might find all of this stuff as interesting as I do.

Enjoy the rest of the Labor Day weekend.

Trump Administration Announces New $20 Bill Design Honoring Harriet Tubman's Owners

WASHINGTON—Saying they wished to pay tribute to the legacies of these distinguished, law-abiding Americans, the Trump administration announced Friday that a long anticipated redesign of the $20 bill would honor Harriet Tubman’s owners. “These were patriotic business proprietors who followed the laws of their time to further their economic interests, and this new currency design finally recognizes these enterprising individuals for their success,” said Treasury Secretary Steven Mnuchin at an afternoon press conference, explaining that the bills commemorating those responsible for enslaving the famed abolitionist, Civil War nurse, and women’s rights activist for the first three decades of her life would be in circulation by 2018. “For too long, we’ve overlooked the achievements of these upstanding citizens and prosperous agriculturalists...

The Trumpist $20 bill

Nature: Cassini's 13 Years of Stunning Saturn Science - in Pictures.

Cassini’s 13 years of stunning Saturn science — in pictures

It's brief, but open sourced; I recommend taking a look at the PDF version.

I love this graphic:


The mission was launched 20 years ago, in 1997. The world seemed so full of hope then; Bill Clinton was President, and even though that racist freak Newt Gingrich was in control of the House of Representatives, it seemed like the country was on the right track in spite of his ignorance - now matched by other racists, Paul Ryan and the worst racist in the Senate, Mitch McConnell.

These creeps would have never funded science like this.

No one thought, in 1997, especially not me, that the entire country would be controlled by Neo Nazis who hated science, but here, 20 years later we face this.

The United States was a great nation in 1997; a nation that could launch the Cassini mission. It is terrible how far we have fallen in so short a time.

Let us hope and work to make it that kind of nation again, not a nation of Klansmen, but a great nation that can do things like Cassini. Let this not be the last outstanding space science and engineering to come from our country!