
the announced follow-up overview of NCSU’s MEAS Dept.

July 21 update: Dr Robinson’s pres courtesy copy


At the invitation of UT’s Department of Physics, Atmospheric Physics Research Group, Dr Walter Robinson, Professor at the Department of Marine, Earth and Atmospheric Sciences, North Carolina State University, presented a talk entitled “It’s All Connected: Model Biases, Gravity Waves, and the Dynamics of General Circulation”.

Dr Paul Kushner, formerly of GFDL, Princeton University, and currently a professor of Physics at UT, acted as host of the talk.

CONGRATULATIONS to him and his student, Lei Wang, for the successful thesis defence, which preceded the lecture!!!

After the talk, Lei Wang, theoretical physics no-longer-GS (!), and Yonggang Liu, paleo-physics GS, kindly provided useful pointers on where to look for what I’m interested in – climate modelling heuristics, climate science for policy making, science–general-public epistemic exchanges. We shared the observation that, due to the tug-and-pull of specialization and genericity, it is not at all easy for a student (obviously, for a well-established prof, too!!!) to go into sufficient depth for their research to “count”, while also obtaining at least a bird’s eye view of “hot topics” on the broader terrain of science. E.g., not all theoretical physicists would have more than an inkling of climate modelling, which is also a legit topic in theo phys. Neither would paleo physicists necessarily be ardent climate model developers, “training” a model to put out data that match the available record, however partial.

In anticipation of a courtesy copy of the PPt for the talk, I’m pasting the abstract below. [July 8 add-on: A few precious slides toward the end of the presentation, which deal with issues Dr Robinson officially labelled “general”, and during the pres referred to as “philosophical”, would be really good to link to! Right up ES’s alley, as already explored in a short series of posts commencing with “Notes on Dr Balaji’s Talk…”.]

Abstract:

As global models evolve from climate models into Earth-system models, it is generally assumed that the basic dynamics of Earth’s fluids are simulated well. There remain, however, persistent biases in how models represent the dynamics of Earth’s atmosphere, even on the largest and putatively well-resolved scales. Thus, the development of better Earth-system models requires continued attention to how they represent the basic dynamics of the atmosphere and the ocean, at the same time that novel new processes are introduced. The best possible treatment of a biogeochemical process will likely fail if driven by erroneous dynamics.
No process in an Earth-system model is less glamorous than the extratropical zonal winds and their maintenance, yet significant model biases in zonal winds remain. These can be traced to uncertainties in gravity-wave drag, which is poorly observed and understood but which plays a crucial role in the momentum budget of the atmosphere. Model zonal winds are sensitive to small errors in gravity-wave drag, for reasons that have only recently been understood and that will be elucidated in this seminar.
Failing to offer an immediate solution to this problem, a less hubristic approach to Earth-system modeling than that currently in vogue will be advocated.


I brought to the lecture my usual research questions:

  • Jim Hansen’s point that climate models cannot capture climate sensitivity, since there may (in his book, will) always be feedbacks we do not know about (Storms of My Grandchildren, 2009; for the precise quote(s), see ES post Notes on Dr Balaji’s Talk, section Re the Computational Complexity of Models).
  • Uncertainty of climate forecasting (NB! which is NOT weather forecasting) as a routine factor in “doing climate science”
  • The importance of the ability of scientists to convey to the general public and policy makers that “uncertainty” IS NOT “unreliability” of science! To quote Dr Robinson, “We should be honest about uncertainty… It is a uniquely challenging problem that society is expecting us to do [me: in view of the implications for policy decisions involving astronomical investments]”. I’d add that key players on the national and international political scene to this day justify, e.g., leaving climate change off the agendas of the June 2010 G8 and G20 meetings by pointing out that scientists themselves are not “sure”. Cf. post-G20 CBC interviews with former cabinet ministers of Canada!
  • The importance of educating non-climatologists [including scientists!] in how to determine whether they are being served “good” science or politics-driven talking points (cf. James Hoggan’s exemplary investigative-journalism account of the subject)
  • The ES audience can also expect some more text on the interdisciplinarity & collaboration profile of NCSU’s Department of Marine, Earth and Atmospheric Sciences, including their Climate Modelling Lab. Same as what I did for Princeton’s GFDL. (See post a-whole-prof-all-to-yourself-enviro-edu-prinstonu)


    Last updated: Tue, June 15, 2010

    link to courtesy copy of Dr. Venkatramani Balaji’s PPt. See Talk Announcement post for transcription of 3 slides of ecosonic interest – “How to Get to Exascale”, “Hardware & Software Challenges”, “Climate Sci – HPC Challenges”
    (Pointer to) UofT-hosted copy (???) of PPt expected on talk + abstract page


    Not having the requisite sci background, I’ll skip the technical core (a huuuuge pity!) and mention a few points of ecosonic interest that Dr Venkatramani Balaji brought up. I’ll also add my related search findings concerning the epistemic flows among actors of various import on the climate science and climate change stage.

    Re scientists and research units of any size “talking” to each other
    VB asked, How do you make climate models usable by a large number of people? (for the purposes of this section, I’m assuming |people = climate scientists|; VB’s “people” likely included non-climatologists and non-scientists as well – per a post-talk exchange – for those, see further below, and note 1 re disciplinary labels)

    The efforts of the Modelling Systems Group at GFDL, which he heads, focus, a.o.t., on developing metadata “in view of facilitating data management of large national and international modeling campaigns such as the IPCC”. In principle, they would be facing the same challenges to the standardization of (semantic web) ontologies that, e.g., University at Buffalo philosopher Barry Smith analyzes in detail. (cf. a pres I gave at UT a couple of years ago)

    On the eco-consonant side, climate model metadata standardization has advanced thanks to the Climate and Forecast (CF) Metadata Convention, which has been adopted or is “encouraged” by a number of research institutions in North America and Europe. Among the 33 institutions and projects listed are, e.g., the SeaDataNet partnership, dubbed a “Pan-European infrastructure for ocean and marine data management”, which currently has 49 partners and also plans to provide university-level training; Humboldt (an EU project based in Germany); the North American Regional Climate Change Assessment Program at the University Corporation for Atmospheric Research (UCAR); and the RAPID THCMIP (Thermohaline Circulation Model Intercomparison Project) at the UK’s Natural Environment Research Council (NERC). GFDL also uses the Convention, though it’s not on the posted list.
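    To make the CF idea concrete for non-specialists: the Convention standardizes the names, units and coordinate metadata attached to each variable in a (typically NetCDF) file, so that files written by different modelling centres can be read and compared without guesswork. Below is only a minimal, made-up sketch using the netCDF4 Python library – the file name, grid and values are illustrative, not taken from GFDL’s or anyone else’s actual data.

# Minimal, illustrative sketch of CF-style metadata with the netCDF4 library.
# File name, grid and values are invented; only the attribute conventions matter here.
import numpy as np
from netCDF4 import Dataset

with Dataset("example_tas.nc", "w") as nc:
    nc.Conventions = "CF-1.6"                  # declare which CF version the file follows
    nc.title = "Illustrative surface air temperature field"

    nc.createDimension("lat", 90)
    nc.createDimension("lon", 180)

    lat = nc.createVariable("lat", "f4", ("lat",))
    lat.standard_name = "latitude"             # CF standard names come from a shared table
    lat.units = "degrees_north"
    lat[:] = np.linspace(-89.0, 89.0, 90)

    lon = nc.createVariable("lon", "f4", ("lon",))
    lon.standard_name = "longitude"
    lon.units = "degrees_east"
    lon[:] = np.linspace(0.5, 359.5, 180)

    tas = nc.createVariable("tas", "f4", ("lat", "lon"))
    tas.standard_name = "air_temperature"      # the controlled vocabulary is what makes
    tas.units = "K"                            # files from different centres comparable
    tas[:] = np.full((90, 180), 288.0, dtype="f4")

    The payoff is that any CF-aware tool can find and plot the field tagged standard_name = “air_temperature” regardless of which model wrote the file.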

    Re scientists and technologies they rely on
    On the eco-dissonant side, VB mentioned among the challenges the fact that, e.g., integrated systems are assembled from multiple manufacturers; enumerating the components involved, he pointed out that this makes for numerous potential “points of failure” (Slide 32). Also, that “new programming models may be needed but are immature” (ibid.).

    Re the computational complexity of models
    A re-visit to my “chronic” query – computational complexity as regards manageability of 1) current (scientific and technological) knowledge and 2) incoming (climate) data and new (scientific and technological) knowledge (cf. Hansen’s concern over current epistemic gaps). E.g., re 1), VB pointed out that “[e]xascale [meaning in the range of 10^18 operations per second and up] software and programming models are expected by 2013, hardware by 2018”, even though individual arithmetic processors are expected to remain at ~1 GHz and the components of a coupled system would each execute on the order of 10^5 processors (Slide 31); so I wonder whether fulfilling the 2013/2018 projections would fall to the quantum computing currently under development. (see expectations of Canada’s Perimeter Institute) Re 2), my question is, How easy is it for the design of models to open up options for incorporating a new variable if/when necessary – without breaking stride? Would it be a piece of cake, adding/changing a few lines of code, or would reworking the program, a new software package edition, be in order?
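    Not knowing how GFDL’s own frameworks handle this internally, here is only an abstract toy sketch – all names invented – of the kind of design that would make the “few lines of code” answer plausible: if the time-stepping core iterates over a registry of processes instead of hard-coding them, a new variable or feedback amounts to one more registered function.

# Abstract sketch only (not GFDL code): a process registry that lets a new
# variable or feedback be added without rewriting the time-stepping core.
from typing import Callable, Dict, List

State = Dict[str, float]                       # variable name -> current value (wildly simplified)
Process = Callable[[State], Dict[str, float]]  # reads the state, returns tendencies per step

PROCESSES: List[Process] = []

def register(process: Process) -> Process:
    """Add a process to the list the core loop iterates over."""
    PROCESSES.append(process)
    return process

@register
def toy_radiation(state: State) -> Dict[str, float]:
    # Toy relaxation of temperature toward an arbitrary reference value.
    return {"temperature": 0.1 * (288.0 - state["temperature"])}

# A newly discovered feedback is one more decorated function; step() below never changes.
@register
def toy_ice_albedo_feedback(state: State) -> Dict[str, float]:
    return {"temperature": -0.01 * state.get("ice_fraction", 0.0)}

def step(state: State) -> State:
    """Apply every registered process once and accumulate the tendencies."""
    new_state = dict(state)
    for process in PROCESSES:
        for var, tendency in process(state).items():
            new_state[var] = new_state.get(var, 0.0) + tendency
    return new_state

state: State = {"temperature": 280.0, "ice_fraction": 0.3}
for _ in range(5):
    state = step(state)
print(state)

    Real coupling frameworks such as the FMS/ESMF/PRISM mentioned in Slide 31 operate, as far as I understand, at the much coarser level of whole model components, but the design intent – keep the core generic, register the specifics – is similar; whether the science itself can be anticipated that way is, of course, the harder question.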

    After all, global models failed to predict the recent rapid Arctic sea ice loss, according to leading US climate scientist Jim Hansen, who’s been dealing with modelling for over 30 years. (It seems this could easily be a case of 1) or 2) posited above, or both (?)) He notes in his 2009 book Storms of My Grandchildren that “[e]ven as our understanding of some feedbacks improves [basically he is saying global models are good for known feedbacks], we don’t know what we don’t know – there may be other feedbacks. Climate sensitivity will never be defined accurately by models.” (p.44, emphasis mine) Thus, he places models as a heuristic below paleoclimate studies and ongoing observations, even as he acknowledges that they have their uses. [see Note on Climate Models (still to do)]

    Surely, [June 28 update: in view of what models can handle within existing human climate knowledge, and the (at least theoretical) possibility to prompt human discovery not intended by the design,] Hansen’s “never” would depend on how advanced the technology that handles (in most general terms) variables is? Would it necessarily be an oxymoron to be programming for “what we don’t know”? After all, isn’t the “unknown” (un)consciously implicated in assessing the “probability” of something happening, which some models do? And, more interestingly for the process of scientific discovery, [substitution June 28: following a deductive thought process,] would what is/may be part intuition/intuitive expectation help climatologists put their finger on previously unknown/unacknowledged factors that contribute to the picture, [update June 28: whether the eye-opener comes independently of or through the modelling technology]? (cf. the experience of discovery of cytogeneticist and Nobel prize winner Barbara McClintock, e.g., per the bio by E. Fox Keller, 1983 – see note 2 below)

    [June 26, 2010 update: I cannot believe I did not put this down – How close is modelling to being able to program for what a human can get out of ongoing observations and paleoclimate data? Are the connections humans can make way too loose to be formalized? Really, how up-to-date is what gets into a model – if, according to Hansen, “ongoing observations” (alongside paleo data) would make a difference in weighing climate sensitivity?
    There ARE paleoclimate models, currently, but are they conversing (well) with programs for current predictions?]

    Re the human actors’ mindset
    It seems, then, that a big part of a good climate scientist’s mindset would be to handle equally well emergent and existing knowledge. This state of affairs would clash with “computer logic” to the extent that software models operate with closed sets of options – the knowledge they host would be part of what their designers know/have access to. Also, it could be that a climate scientist’s attention is trained on identifying patterns of behaviour of feedbacks (from observations and historically) and on how well a model recreates history and anticipates future behaviour, whereas that of a programmer or a software engineer would be targeting what’s wrong with how a computer executes a program, rather than the truth value of the science fed into the program.

    Oops, essentializing and dichotomizing? [take a look at update June 18 – June 23, 2010, below] To the extent that it would help design a workable ecosonic model of Human-Human and Technology-Human Relatedness, yes. Noam Chomsky called this operation (unavoidable for him, fallacious for others) “idealizing the data” (in formal linguistics, I have to add).

    So, How can the two communities find a common “language” (used metaphorically, I do not mean Python, etc.) to build climate models that work, and do it well, at the same time spurring on technology to match ongoing developments in science? Added to that is the possibility that the options presented by a technology can inspire (serendipitously or otherwise) innovative ways to approach the “science” itself. Form influencing content, in most general semiotics terms.

    Re grist in the mill of disseminating and passing on scientific knowledge
    Lots of wisdom and communication mastery are needed to achieve eco-consonance in the case of communicating science to the public (which, I’d imagine, would include non-climate scientists, who’d want the “results”, not the “process”), ditto passing on this knowledge – in addition to knowledge of climate science per se – to future generations of scientists.

    In all www evidence, Princeton University’s Cooperative Institute for Climate Science (CICS) is already scoping out the former, while doing an excellent job of the latter. (see post A Whole Prof All to Yourself) In a quick post-lecture exchange, VB mentioned that GFDL (or CICS?) has in the past included seminars (incl. for grad students?) on how to communicate with the public.

    These skills would be mandatory for the purposes of providing decision support, as Stanford’s Stephen Schneider amply illustrates in his 2009 book Science as a Contact Sport, including in the context of negotiations over IPCC4’s WGII (check Vocab post) report between scientists (himself, a.o.) and government reps (he “converted” Kenya’s delegate, if I remember correctly). Schneider records, with well-timed anecdotal relief, the draining mind+rhetorics battles over, e.g., what scientists vs. policymakers mean by “confidence” vs. “high confidence”, or “likely” vs. “very likely”, as applied to CC, which “inspired” him to propose percentage quantification. (in a similar vein, see the Vocab post entry re IPCC’s euphemistic/diplomatic use of “climate change” in lieu of “global warming”)

    Intermediate conclusions
    A wide-ranging epistemic transfer – exchange, rather – is needed within and between climate science and the software engineering/technological domain for quality knowledge production, with special attention to exchanges between scientists and future scientists.

    Since climate knowledge production depends, it would seem, as much on modelling technologies as on data observation and paleoclimate studies, and since, conversely, climate modelling depends as much on climate knowledge as on technological competence, it would be extremely beneficial to prioritize close collaboration between climatologists and software engineers. In addition to the technical, programming side, model-building will crucially benefit from SE’s legacy of requirements engineering and standardization methodology.

    To the observation that what SE has developed over and above programming may be/is largely irrelevant to climatologists, I’d advocate considering “translating”, not “transferring” that legacy, thus:

    * * * * * * * * * * A Sub-branch of SE – perhaps? * * * * * * * * * *

    As regards decision support, it requires accurate and adequately selected knowledge, and accessibly and diplomatically executed knowledge exchange among climate scientists, and between them and non-climate scientists, humanitarians, policymakers, the general public… Think of what is involved in preparing the Assessment Reports of IPCC’s WGI (scientists), and WGII and III (interfaces with economists, policymakers et al.), and, ultimately, the Synthesis Reports, based on the work of all three working groups.

    If any representatives of (any of) the non-climatologist demographic groups above are to be made privy to “wassup with climate”, how are they not a “client” whose needs and background should be taken into account? Closer to the core of science, if the output of models were to count on a par with – if not census data – then official (scientific) publications, in the public domain and with proper credit and responsibility assigned, then any climate-savvy external scientist not privy to the workflow of a research community of any size would also be a client – a highly demanding one, at that. Plus, even if a climatologist is designing the program for her-/himself, they are, after all, following some tacit requirements, as their own “client”. [check if Dr Balaji/s.o. else has a graphic representation of the varying scope of climate models req’s engineering, which I am assuming cannot equal zero, even when “doing it for oneself”.]

    Educating for climate-science-and-software-engineering hybridity
    In the education section, I’d like to mention that Dr Balaji is expecting the publication of a textbook he’s been working on, 1-2 years from now. He also develops courses he teaches at Princeton.

    What caught my attention was his emphasis on there being much more “downstream science” than there are scientists prepared to meet the climate analysis demand. He identifies this as a scientific scalability challenge. (Slide 33)

    Currently, browsing the researcher profiles at GFDL, home to Dr Balaji’s Modelling Systems Group, I do not recall coming across one that explicitly features formal training in software engineering.

    Dr Balaji liked my idea of “hybrid” education, organically interfacing climate science and software engineering. In my terminological taxonomy, “hybridity” would mean superseding multidisciplinarity and moving from the interdisciplinarity stage on to the transdisciplinarity one. That is, going beyond “merely” juxtaposing self-contained disciplines (multi = Lat. “many”), proceeding with epistemic exchanges (inter = Lat. “between”), and even with disciplinary merger (trans = Lat. “across”, “through”). [should link to my presentation on ***-disciplinarity]

    However, the “linguistic” (metaphorically speaking) correspondences between the two communities and their epistemologies regarding modelling are far from obvious.

    In principle, to the extent that the “language barrier” between an A and a B is overarchingly disciplinary (setting aside individual psychological specificity), “translation” between two distinct disciplinary mindsets may pose a problem – and “climate science” itself IS a crossroads of multiple disciplines, which multiplies the potential barriers to eco-consonant relationships. What happens with the addition of yet another player, software engineering (SE)? One mindset is predicated on an open – and perhaps undefinable – set of options (the “not-knowing-the-unknown” problem per Hansen above), the other on an explicitly defined set of options (remember the colloquial idiom “engineering solution”?). Request: Pls keep in mind that this is only an abstract ecosonic model, in this particular case also playing with stereotypes, certainly not meant to reflect the various degrees and shades, especially as related to specific (not excluding exceptional!) individuals 🙂

    If “translation lossy-ness” is what has so far prevented (it would seem, desirable, and urgent!) closer collaboration between climatologists and software engineers, then perhaps a good motivator for more extensive epistemic exchanges could be the opportunity to slim down each other’s Zones of Proximal Development? (see note 3)

    Once the leadership by already accomplished engineers and climate scientists is in place, carefully thought-out hybrid university-level (grad?) curricula/programs may be the path to developing the requisite “linguistic” skills of the future generations in the making. After all, in view of current IPCC estimates, well-prepared climate-tackling scientists, including meta-science communication talent, would be needed for at least the next century, moreover in top-priority mode.

    The catch in this uplifting scenario? See …From Contact to In-tact Sport post.


    NOTE 1: In this post I am using “climate science”/”climatology” as a shorthand for a variety of disciplines involved in the study of climate, as represented, e.g., in the research profiles at GFDL – geophysics, atmospheric physics and chemistry, oceanography.

    NOTE 2: Because this definitely is a book worth reading for those interested in scientific discovery, the exact bibliographic info:
    Evelyn Fox Keller, A Feeling for the Organism: The Life and Work of Barbara McClintock (San Francisco: W.H. Freeman, 1983), esp. “Chapter Three: Becoming a Scientist.”

    For the other references, pls see post Researched CC Material (Ongoing)

    NOTE 3: Widely recognised psychologist and pedagogue Lev Vygotsky postulated that what he termed the “Zone of Proximal Development” is the difference between a student’s capacity to arrive at a solution on his/her own and the capacity to do so with the help of a more experienced teacher/adult… He also recommended giving students assignments within the ZPD, to encourage cognitive development. I couldn’t agree more, and would emphatically extend his recommendation to Any Learning Context at Any Stage in Life.

    update June 18 – June 23, 2010
    It is important to stress that technology is not devoid of scientific “texture”, and that science and technology are teammates rather than rivals. Hence the term “technoscience” (see Vocab entry). In establishing patterns and physical dependencies between abiotic, moreover artefactual, units, software engineering, like e.g. cybernetics, is very much the counterpart of biology and medicine, whose units are biotic – which until not that long ago also entailed “non-artefactual”. But with cyborgs on the rise… (see Vocab entry)

    It must be the “applied” part of it that gave engineering the meaning in the idiom “engineering solution” (“mechanistic”, “operational”), e.g., looking to “fix” rather than “explain”, and (historically) kept it from entering university curricula until late in the 19th century, in North America, at least.


    Last updated: Tue, June 15, 2010

    link to courtesy copy of Dr. Venkatramani Balaji’s PPt. see below for transcript of 3 slides of ecosonic interest – “How to Get to Exascale”, “Hardware & Software Challenges”, “Climate Sci – HPC Challenges”.
    (pointer to) the UofT hosted copy of the PPt (???) expected here


    Dr Balaji is expecting the publication of a textbook he’s been working on, in a couple of years’ time. He also teaches courses in his area at Princeton.

    The talk was hosted by Paul Kushner, formerly of GFDL, currently Associate Prof, UT, and W. Richard Peltier, Professor, UT, both with the Atmospheric Physics Group.


    When: Tue June 8, 2010; 4:10pm Room TBA [Rm 408, 60 St George St]
    Presenter: V. Balaji
    Title: Climate Computing: Computational, Data, and Scientific Scalability

    Abstract:
    Climate modeling, in particular the tantalizing possibility of making projections of climate risks that have predictive skill on timescales of many years, is a principal science driver for high-end computing. It will stretch the boundaries of computing along various axes:

    – resolution, where computing costs scale with the 4th power of problem size along each dimension
    – complexity, as new subsystems are added to comprehensive earth system models with feedbacks
    – capacity, as we build ensembles of simulations to sample uncertainty, both in our knowledge and representation, and of that inherent in the chaotic system. In particular, we are interested in characterizing the “tail” of the pdf (extreme weather) where a lot of climate risk resides.

    The challenge probes the limits of current computing in many ways. First, there is the problem of computational scalability, where the community is adapting to an era where computational power increases are dependent on concurrency of computing and no longer on raw clock speed. Second, we increasingly depend on experiments coordinated across many modeling centres which result in petabyte-scale distributed archives. The analysis of results from distributed archives poses the problem of data scalability.

    Finally, while climate research is still performed by dedicated research teams, its potential customers are many: energy policy, insurance and re-insurance, and most importantly the study of climate change impacts — on agriculture, migration, international security, public health, air quality, water resources, travel and trade — are all domains where climate models are increasingly seen as tools that could be routinely applied in various contexts. The results of climate research have engendered entire fields of “downstream” science as societies try to grapple with the consequences of climate change. This poses the problem of scientific scalability: how to enable the legions of non-climate scientists, vastly outnumbering the climate research community, to benefit from climate data.

    The talk surveys some aspects of current computational climate research as it rises to meet the simultaneous challenges of computational, data and scientific scalability.
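    [A back-of-the-envelope reading of the “4th power” claim, for what it’s worth – my arithmetic, not the abstract’s: halving the grid spacing doubles the number of points in each horizontal direction and, in one common accounting, in the vertical too, while the stability (CFL) condition forces the time step to halve as well, so

    cost ∝ (1/Δx) × (1/Δy) × (1/Δz) × (1/Δt), i.e., doubling the resolution ≈ 2 × 2 × 2 × 2 = 2^4 = 16 × the computation.

    If the vertical grid is held fixed, the exponent drops to 3; I read the abstract’s “4th power” as the fuller accounting.]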


    Slide 31. How to get to exascale
    If individual arithmetic processors are going to remain at ~1 GHz (10^9), how do we get to exascale (10^18)? We need billion-way concurrency!

    • Components of a coupled system will execute on O(10^5) processors (driver-kernel programming model)
    • There will be O(10) concurrent components coupled by a framework (FMS, ESMF, PRISM)
    • We will reduce uncertainty by running O(10-100) ensemble members.
    • We will use a task-parallel workflow of O(10-100) to execute, process and analyze these experiments (FRE).

    Exascale software and programming models are expected by 2013, hardware by 2018.

    [What got pasted here from the PDF file as O is, as best I can now tell 🙂, order-of-magnitude notation rather than a frequency unit – O(10^5) processors meaning “on the order of 10^5” processors. Lynne]
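    [My own arithmetic, not the slide’s, but the “billion-way” figure seems to follow from simple division, and the bulleted layers multiply out to a comparable order of magnitude:

    10^18 operations/s (exascale) ÷ ~10^9 operations/s per ~1 GHz core ≈ 10^9 operations in flight at once;
    10^5 (processors per component) × 10 (components) × 10–100 (ensemble members) × 10–100 (workflow tasks) ≈ 10^8–10^10.]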

    Slide 32. Hardware and software challenges

    • We still haven’t solved the I/O problem. (Useful data point: our IPCC-class climate models have a data rate of 0.08 GB/cp-h.)
    • Integrated systems assembled from multiple manufacturers: chips, compilers, network, file systems, storage, might all come from different vendors. Many points of failure.
    • Multi-core chips: many processing units on a single board. Since our codes are already memory-bound, we do not expect to scale out well on multi-core.
    • New programming models may be needed, but are immature: Co-Array Fortran and other PGAS languages, OpenCL.
    • Reproducibility as we now understand it is increasingly at risk: GPU for instance does not appear to have a formal execution consistency model for threads.

    Slide 33. Climate science: HPC challenges

    • Adopt high-level programming models (frameworks) to take advantage of new approaches to parallelism should they become operational.
    • Component-level parallelism via framework superstructure.
    • Approach models as experimental biological systems: single organism or “cell line” not exactly reproducible; only the ensemble is.
    • There is more “downstream” science than there are climate scientists: a scientific scalability challenge.
    • Use curator technology to produce “canned” model configurations that can be run as services on a cloud.
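    To make the last bullet concrete for non-specialists – an abstract sketch only, not GFDL’s actual curator technology: the idea of a “canned” configuration is that everything needed to reproduce a run is captured as data, so a service can execute it on demand without anyone hand-editing model code. All names and values below are invented for illustration.

# Abstract sketch only: a "canned" model configuration expressed as data,
# which a hypothetical cloud service or job scheduler could accept and run.
import json

canned_config = {
    "model": "illustrative-esm",             # which model code/version to run
    "resolution": "2deg_atmos_1deg_ocean",
    "scenario": "historical_1900_2000",      # forcing scenario named, not hard-coded
    "ensemble_members": 5,
    "outputs": ["tas", "pr"],                # CF-style variable names to archive
}

def submit(config: dict) -> str:
    """Stand-in for handing the configuration to a remote service."""
    return json.dumps(config, indent=2)

print(submit(canned_config))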

    December update:
    2 upcoming books: Jan 2011
    1) Principles of Planetary Climate (Cambridge University Press–January 31, 2011)
    Book description: Provides a unified treatment of the basic physical principles of planetary climate phenomena on the present and past Earth and other planets. An invaluable textbook for advanced undergraduate and graduate students, and a reference text for researchers. Lavishly supported by hundreds of creative and stimulating exercises, software, datasets and algorithms.

    2) The Warming Papers, co-edited with David Archer (Wiley-Blackwell, January 25, 2011)
    From the Amazon product description: “The Warming Papers is a compendium of the classic scientific papers that constitute the foundation of the global warming forecast.  The paper trail ranges from Fourier and Arrhenius in the 19th Century to Manabe and Hansen in modern times. Archer and Pierrehumbert provide introductions and commentary which places the papers in their context and provide students with tools to develop and extend their understanding of the subject.”


    post 2B edited/reworked:


    institutional page
    re his textbook Principles of Planetary Climate, submitted to CUP
    Real Climate blog

    It has been a while since I enjoyed a couple of his talks as part of his full week of Noble Lectures (April 19-23, 2010), organized this year by the Program in Atmospheric Physics at the Department of Physics, UT. They are “right up the ecosonic alley”, so some thoughts below.

    The main attraction for me was the opportunity to hear a recognized scientist’s take on 1) the ethics aspects of scientific endeavour, and also 2) how the habitable conditions on Earth project into the universe at large (or at least as large a chunk of it as we can currently see).
    1. Climate Ethics, Climate Justice – Thursday April 22nd
    Lecture Slides; Nine Billion Ton Hamster
    2. At the Outer Limit of the Habitable Zone – Friday April 23rd
    Lecture Slides
    ————————————————–

    On the subject of “fairness”: The series of graphs on pp 41-46 (slides 38-43) is the first concrete comparative mapping I’ve come across of the US–China CO2 emissions contributions. France serves as a control case of sorts. China’s yearly total emissions (at roughly present day) exceed those of the US, and either country’s output by far exceeds that of France (p 41). By contrast, the US and France (as the runner-up) are both larger contributors than China w.r.t. per capita emissions (p 42). Adding to the picture the standard of living (SoL) of the three, it looks like France is doing better than the other two: its SoL approaches that of the US, but its per capita emissions are closer to China’s, whose SoL is considerably lower than that of the other two. (do not look for a SoL graph – the speaker was relying on “common knowledge”)

    Given its steep climb, even though it started much later than the US and France, which have had a head start since the mid-19th century, China is expected to catch up not just with France but also with the US in CO2 Ems. Per p 46, at present China and the US diverge in CO2 usage: the pink area marks the US’s carbon “overdraft” (= pollution tonnes over and above where the output should be now), the green one China’s carbon “balance” (call it “credit” = fewer pollution tonnes than the projected limit). Prof Pierrehumbert pointed to the pink area, saying that “that’s where climate justice lies!”. Indeed, if international agreements were to hold legal sway. At least, it is clear that pushing for stricter measures vis-à-vis China, as the US did at the Copenhagen COP15 in December 2009, did not have the back-up of a US clean(ed-up) record.

    Another comment by Prof Pierrehumbert I’d rather attribute to an attempt to lighten the tone, or similar. Comparing China and France, he suggested that if France managed to rein in its Ems while maintaining its higher SoL, it may – idealistically – be possible for China to raise its SoL without increasing its Ems. The two countries’ industrial heritage, work force and intellectual/cultural capital, however, are not comparable/interchangeable. Other than that, certainly, if collective international action kicks in, then there will be less fear that whoever reduces is being taken advantage of by those who don’t.

    Another point worth mentioning is albedo (solar reflectivity) engineering (e.g., spewing aerosols into the atmosphere) as a means of controlling solar radiation and, by extension, reducing temperatures (see sources in the Albedo Engineering post). Prof Pierrehumbert labelled it a “moral abomination” without hesitation – and I agree with the sentiment. It is in the same “scientific” style as the use of DDT last century and of a number of other extremely harmful pesticides, whose side effects by far outweighed their presumed benefits; ditto bio-engineering, even the use of chemo and radiation therapy for cancer treatment. True, there may be more than enough cases where no (sufficiently effective) nobler approach is available, and the toss may well be between using drastic measures as a last resort and giving up altogether. However, if, as Prof Pierrehumbert noted, AE and similar measures may also be misguided/miscalculated, then, in the absence of a commensurate antidote, they would act as purposeful pollution. “Involuntary manslaughter” comes to mind, which may not be far behind “natureslaughter”, the roughshod approach matching in eco-dissonance the “epistemic imaginary of mastery” that generates it, and that epistemologist Lorraine Code identifies as the still-dominant imaginary in the Western world, not excluding the realm of science.

    A couple of comments on the “At the Outer Limit of the Habitable Zone” lecture. It was a detailed excursus through space (our galaxy) and time (e.g., back to the time of early Mars), comparing conditions on heavenly bodies (Prof P. did not use the term) such as Venus, Mars, the (Earth’s) Moon and Titan, to try to demonstrate how close they are, or have ever been/might ever be, to what constitutes the “habitable conditions” of our planet. The most frequently considered parameters were CO2, H2O, heavenly body size, gravity, and the related planetary and atmospheric chemistry. All of the above in the face of missing/unavailable/inaccessible data, which unavoidably gave the exercise a romantic twist, if you like. Notwithstanding the ensuing huge number of variables, which can clearly over-generate possible hypotheses far beyond the ones discussed, the journey was quite enjoyable.

    Quite delightfully, Prof P. concluded with – believe it or not – a poem featuring Sir Gawain taking his leave, the point being that the knight was rather unwilling to bid goodbye.

    Discordantly, yours truly had to jump in – after a number of noble technical questions – with an admittedly general public query: Should humanity be faced with a catastrophe, what are our chances of finding a habitable host, travel challenges aside? Having grown up as a sci fi fan, what I meant was whether, to his knowledge, researchers have been able to identify or target potential candidates for a habitable host, irrespective of the currently insurmountable speed-of-light barrier. The answer boiled down to, “In view of yesterday’s lecture, it is unlikely that we’d wipe ourselves out”. The conditions may be very unpleasant, but a resilient/resourceful species like ours will find a way to adapt (I imagine, technologically, and brrrr even genetically?). He mentioned “bubbles” on the surface, and, looking at least half a billion years ahead, when the Sun will have grown considerably hotter (on its way to becoming a red giant), we might use super-size mirrors to reflect/deflect its radiation away from the Earth, thus putting off the ultimate destruction of the planet as we know it by a few million (?) years.

    I did not ask about planetary engineering, nor did I broach my favourite subjects:

      strategic deep-space research for similar planets
      research on renewable energies

    In the latter case, including anything commensurate with Nikola Tesla’s project, the surviving memory of which has it that he was working on utilizing the Earth’s own atmosphere (tapping into the ionosphere’s energy) as a source of inexhaustible, free energy for all.

    Among numerous other Tesla YouTube videos – likely a school project (?) in the voices of two girls, Maia MacCarthur and Kiana Wilson: http://www.youtube.com/watch?v=gfvd1g_fcUote

