Site running on zola and NixOS

Published: 2022-04-04; Last edited: 2022-04-04

The last few days I converted my blog from a makesite-based site to a zola site. For that I had to create zola templates out of the makesite templates; my plan is to turn them into a zola theme. I also implemented a small preprocessor that converts the LaTeX syntax used in the Markdown files into KaTeX HTML code, so the formulas and equations you see are static.
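The core idea of such a preprocessing step can be sketched in a few lines. Everything below is illustrative rather than the actual implementation: the regex, the function names and the `render` stand-in are mine, and a real version would hand the extracted TeX to KaTeX (e.g. `katex.renderToString`) to produce static HTML.

```python
import re

# Matches $$display$$ math first, then $inline$ math in the Markdown source.
MATH_RE = re.compile(r"\$\$(.+?)\$\$|\$(.+?)\$", re.DOTALL)

def render(tex: str, display: bool) -> str:
    # Stand-in for the actual KaTeX call (e.g. renderToString with
    # displayMode=display); here we only wrap the TeX so the flow is testable.
    tag = "div" if display else "span"
    return f'<{tag} class="katex">{tex}</{tag}>'

def preprocess(markdown: str) -> str:
    def replace(match: re.Match) -> str:
        display = match.group(1) is not None
        tex = match.group(1) if display else match.group(2)
        return render(tex.strip(), display)
    return MATH_RE.sub(replace, markdown)
```

A real implementation would also have to skip fenced code blocks so that dollar signs inside code are left alone.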

The source of acousticcloud.net can be found here.

Additionally I switched the server from my hoster all-inkl.de to a NixOS server hosted on a Hetzner VPS.

I also added a photos section. For the gallery I use CSSBox.

Inspired by our hacc infrastructure I set up a NixOS server to run some services on it. One of them is this blog. I'll publish the Nix files when I'm done cleaning them up.

Colloquium on granular matter in space

Published: 2019-01-27; Last edited: 2019-01-29

On December 21st 2018 I attended a colloquium on granular matter in space, hosted by DLR Oberpfaffenhofen. The speaker was Professor Matthias Sperl of the Institute of Materials Physics in Space of the DLR.

I did not take a lot of notes, so this post will be a little shorter than the last one.

At the beginning of the talk Prof. Sperl dropped some words I could not relate to directly (especially the first two): Soft matter, dusty plasma and space conditions.

Our knowledge of granular matter is still limited, and its fundamental physical properties are not completely understood. Especially the interactions with the environment, but also those within the medium, are being researched; that way the material behavior can be analyzed.

He compared this with the vast knowledge in the aeronautics sector about aerodynamics and the possibilities to model and calculate the aerodynamic behavior of planes or airfoils. Wind tunnels, too, have been a common tool for well over 100 years. None of this is the case with granular matter.

Some problems that could be analyzed with more knowledge of granular matter include the rings of Saturn, the formation of planets, soil science and process engineering.

Granular matter can be divided into three categories with one common denominator: heterogeneous transmission of force. The three categories are solid, liquid and gas. That means granular matter can behave, depending on its environment and the situation the matter is in, similar to a solid, a liquid or a gas.

The experiment that he presented (as a video) showed the non-intuitive behavior of granulates. Take 100 particles, place them in a partially partitioned box (50/50 on both sides), shake the box until the particles are spread out evenly, and then slowly decrease the shaking. What is the distribution of the particles now? See this video. The effect is called Maxwell's demon.
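The effect can be reproduced with a toy model. Everything here is my own assumption, not from the talk: the flux law `(n/N) * exp(-n/(shaking*N))` is just a stand-in capturing the essential mechanism, namely that at weak shaking a crowded compartment loses energy to collisions, so fewer of its particles make it through the hole.

```python
import math
import random

def shake_box(n_particles=100, steps=50_000, shaking=0.3, seed=42):
    """Toy compartment model of the granular Maxwell's demon.

    Each step, every compartment ejects one particle into the other with
    probability (n/N) * exp(-n / (shaking * N)). The exponential factor is
    an assumption standing in for the real dissipative flux law: at weak
    shaking a crowded compartment dissipates kinetic energy in collisions,
    so *fewer* of its particles reach the hole.
    """
    rng = random.Random(seed)
    total = n_particles
    left = total // 2
    for _ in range(steps):
        for donor_is_left in (True, False):
            occ = left if donor_is_left else total - left
            p = (occ / total) * math.exp(-occ / (shaking * total))
            if occ > 0 and rng.random() < p:
                left += -1 if donor_is_left else 1
    return left

# Weak shaking: the symmetric 50/50 split is unstable and almost all
# particles pile up on one side. Strong shaking: the split stays even.
print(shake_box(shaking=0.3), shake_box(shaking=10.0))
```

With weak shaking the fuller side ejects fewer particles than the emptier side, so any fluctuation away from 50/50 grows until nearly all particles sit in one compartment, which is exactly the counter-intuitive outcome of the video.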

At the end of the talk he showed that, with the right amount of money, it is already possible to print a moon station out of the granular matter that can be found there: the regolith.

At the end of this post I'll list some more notes I took during the colloquium; unfortunately I do not remember enough to fill those points with more content.

NYT research on Libyan Coast Guard

Published: 2019-01-27; Last edited: 2019-01-28

The New York Times researched an incident caused by the Libyan Coast Guard in November 2017. I heard about that incident some time ago in a podcast. Here are the links:

How We Made an Invisible Crisis at Sea Visible, NYT
Es ist Mord, Spiegel.de (in German)
WR838 Seenotretter Johann Pätzold, WRINT (podcast, in German)

CO2 emission budget for 1.5 °C increase

Published: 2019-01-14; Last edited: 2019-01-16

The 2018 Special Report on Global Warming of 1.5 °C gives some numbers on the CO2 emission budgets. One important number is 420 Gt of CO2 (see, in German). It's the amount that we, the people of the earth, are still allowed to emit while keeping a 66% chance of staying below a 1.5 °C temperature increase (in this post called TEB, total emission budget). The amount of CO2 looks huge, but given that we emit around 35 Gt to 40 Gt a year, it's not that much.

As I'm trying to make the facts on climate change a little more comprehensible, I thought up some ideas on how to interpret the 420 Gt.

As in my post about ozone and CO2, I like the idea of using the height of the CO2 layer and its change caused by additional emissions. So how thick is the additional layer of CO2 that is "allowed"? And how much mass per square meter is this? Or what area does one ton of CO2 of that budget cover?

Here are the results:

h_{TEB}=\frac{m_{TEB}}{\rho_{CO_{2}}A_{E}}=\frac{420\cdot 10^{12}~\mathrm{kg}}{1.977~\mathrm{kg/m^3}\cdot 510\,075\,000\cdot 10^{6}~\mathrm{m^{2}}}=0.4165~\mathrm{m}

This doesn't sound like a lot! One calculation that I found even more interesting is the CO2 per square meter:

\frac{m_{TEB}}{A_{E}}=\frac{420\cdot 10^{12}~\mathrm{kg}}{510\,075\,000\cdot 10^{6}~\mathrm{m^{2}}}=0.8234~\mathrm{kg/m^{2}}=823~\mathrm{g/m^{2}}

That is an impressive number! Basically, by emitting 823 g of CO2 you push one square meter of the earth's surface past the 1.5 °C budget! To get an idea of what 823 g means: a modern car emits around 100 g to 150 g of CO2 per kilometer!

The last calculation is the area that one ton of CO2 pushes past the 1.5 °C temperature increase.

\frac{A_{E}}{m_{TEB}}=\frac{510\,075\,000\cdot 10^{6}~\mathrm{m^{2}}}{420\cdot 10^{9}~\mathrm{t}}=1214.46~\mathrm{m^{2}/t}\approx 34.85^{2}~\mathrm{m^{2}/t}

In Germany the average emission of CO2 per person and year is around 9 tons (source). That means 10930.14 square meters (roughly (100 m)²) reach the 1.5 °C increase per year and person.
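The three budget numbers above can be reproduced with a few lines; the constants are those given in the post, the variable names are mine:

```python
# Reproducing the budget interpretations from the formulas above.
A_EARTH = 510_075_000 * 1e6       # earth's surface area in m^2
RHO_CO2 = 1.977                   # density of CO2 in kg/m^3
M_TEB = 420e12                    # total emission budget in kg (420 Gt)

layer_height = M_TEB / (RHO_CO2 * A_EARTH)   # thickness of the CO2 layer, m
mass_per_m2 = M_TEB / A_EARTH                # budget per square meter, kg/m^2
area_per_ton = A_EARTH / (M_TEB / 1000)      # area "used up" per ton, m^2/t

print(f"{layer_height:.4f} m, {mass_per_m2:.4f} kg/m^2, {area_per_ton:.2f} m^2/t")
```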

Colloquium on data science in earth observation

Published: 2018-11-26; Last edited: 2019-01-02

On November 26th 2018 I attended a colloquium on data science in earth observation, hosted by DLR Oberpfaffenhofen. The speaker was Professor Xiaoxiang Zhu of the Remote Sensing Technology Institute (IMF) and the TU Munich.

The Copernicus satellites of the Sentinel series were mentioned at the beginning of the talk. Because of their high spatial resolution and their reliable, continuous and daily output of data, they are a good source for further analysis, on their own or in combination with other data sources. The data is also openly available.

Two companies offering scientific earth observation data were mentioned as well. Unfortunately I do not remember exactly in what context those two were mentioned. Unlike, for example, the Sentinel data, the data those two companies offer has to be purchased. The first one was Descartes Labs; they offer a demo map where you can see some data. The second one is Orbital Insight.

An impressive animation that was shown was the shrinking of the arctic sea ice over the last decades, by C. Künzer of the DFD. I can't find a link yet; as an alternative I can offer you this one.

One acronym that was dropped was AI4EO (https://eo4society.esa.int/wp-content/uploads/2018/09/ai4eo_v1.0.pdf). It stands for "Artificial Intelligence for Earth Observation". It's a project, or rather a new concept or approach, for enhancing earth observation by using AI and neural networks.

Next Prof. Zhu mentioned the link between real-world physics (and science) and the results the complex algorithms produce. As an example she showed an animation of the Berlin main station. With the help of TanDEM-X data it was possible to show the "breathing" of the building over the year caused by thermal expansion. It works like tomographic slices to rebuild a 3D model of the scanned object. With the X-band used for TanDEM-X a theoretical resolution of around 1 m is achievable, but with the amount of data collected over the years it is possible to extract data of much higher spatial resolution ("Persistent Scatterer Interferometry").

More complex and detailed analysis with potentially higher resolution can be achieved by combining different data sets from different sensors. One aim is to use the EnMAP data; the satellite EnMAP is planned to launch in 2020.

Prof. Zhu then focused on the AI, respectively deep learning, itself. She compared first-generation neural networks with modern ones: the number of hidden layers rose from 1 to up to 1000. As I'm totally new to the topic I can't explain what that means at this point and first have to do a little research. Besides the obvious step of reading the Wikipedia article, one can read the paper "Deep learning in remote sensing: a review" that Prof. Zhu et al. published in 2017.

One question Prof. Zhu asked at this point:
What makes deep learning in earth observation special?

A small wrap up about the use of deep learning in earth observation was:

Hyperspectral analysis was the next topic; it covers the third point above. For earth observation a lot of different wavelengths are used to extract information about the atmosphere and/or surface of the earth. Of course it is quite sensible to combine all of the data collected by the different instruments scanning the earth. As those datasets are (becoming) huge, with more and more satellites, past, present and future, collecting data at increasingly high temporal and spatial resolution, algorithms have to take over the job.

Open issues or questions that were mentioned are:

Some aims mentioned:

A fascinating idea is to combine data of classical earth observation with data collected by social media platforms like Twitter, or Google Maps with its user-made photos. The name of this idea is So2Sat (Social Media to Satellite). In combination that means:

This way it is possible to automatically generate complex 3D maps with annotations about the function of certain buildings.

Prof. Zhu showed a comparison of OSM data and twitter mapping. The commercial places were accurately predicted!

Prof. Zhu also mentioned competitions in data science. Some bullet points were:

At the end of the colloquium there was a small Q&A. Only two questions/concerns were mentioned:

During research for this article I stumbled over this page:
Awesome Satellite Imagery Datasets

The history of climate change science

Published: 2018-10-20; Last edited: 2018-10-21

After searching and finding the video "The Unchained Goddess 1958 - Bell Science Hour (Discusses Weather / Climate Change)" that was mentioned in the New York Times article "Losing Earth: The Decade We Almost Stopped Climate Change" I was curious if I could find the important works and articles by the early climate scientists that were mentioned in the NYT article. Here they come:

John Tyndall, 1862

Svante Arrhenius, 1896

Svante Arrhenius, 1908, pages 51 to 63

Guy Stewart Callendar, 1937

A nice summary by sciencehistory.org

Impressive graphs showing climate change

Published: 2018-10-20; Last edited: 2018-10-20

Just some links to impressive graphs and animations visualizing climate change.

https://xkcd.com/1732/

http://openclimatedata.net/climate-spirals/

http://www.climate-lab-book.ac.uk/spirals/

Power needed for electric cars and its CO2 emission

Published: 2018-10-15; Last edited: 2018-11-22

Just some calculations, because I was curious. The Kraftfahrt-Bundesamt says the distance traveled by all road vehicles (KFZ) in 2017 was 730 000 000 000 km. On average an electric car needs around 168 Wh/km (summer) and 225 Wh/km (winter) (heise.de).

730\,000\,000\,000~\mathrm{km}\cdot(168+225)~\mathrm{Wh/km}/2=143.445~\mathrm{TWh}

All power generated in Germany in 2017 was 549.9 TWh (energy-charts.de), with 38.2 % coming from renewable sources, i.e. around 210 TWh. That means all the power needed could be produced by renewable energy alone. So nothing utopian, though also not really an easy task.

A typical automobile battery might have a capacity of 50 kWh and a life expectancy of 500 charges.

143\,445\,000\,000~\mathrm{kWh}/(500~\mathrm{ch./bat.}\cdot 50~\mathrm{kWh/ch.})=5\,737\,800~\mathrm{batteries}

So 5 737 800 batteries are used up per year.

That sounds like a lot. Compared to the 10 000 000 normal (12 V) car batteries per year (welt.de, in German) it is still a lot, especially if you factor in the price.

Around 110 kg of CO2 are emitted for producing 1 kWh of battery capacity (lithium-ion, wiki).

must be updated: 5\,256\,000\cdot 50~\mathrm{kWh}\cdot 110~\mathrm{kg/kWh}=28\,908\,000\,000~\mathrm{kg_{CO_{2}}}=28.908~\mathrm{Mt_{CO_{2}}} per year!

909 Mt of CO2 were emitted in Germany in 2016; 166 Mt of that was produced by the transport sector (I can't find the numbers for road traffic alone). If you use 135 g/km of CO2 (auto.de, 2013) as a ballpark figure, the emission of conventional road traffic is 98 550 000 000 kg = 98.55 Mt.

489 g of CO2 are emitted per kWh generated in Germany (2017, Umweltbundesamt). That means 0.489 kg/kWh * 131 400 000 000 kWh = 64 254 600 t = 64.25 Mt are emitted for powering the road traffic sector. In sum (battery production plus power for moving the cars): 64.25 Mt + 28.908 Mt = 93.16 Mt! That is roughly the same as the emission of conventionally powered road traffic (without counting the power needed to extract oil and produce gasoline and diesel)!
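The post mixes figures from different editing passes (5 256 000 vs. 5 737 800 batteries, 131.4 vs. 143.445 TWh). As a sketch, here is the whole chain recomputed consistently from the inputs given above; the variable names are mine, and the conclusion still holds: the EV total lands in the same ballpark as conventional road traffic.

```python
KM_PER_YEAR = 730e9                 # km driven by all road vehicles, 2017
WH_PER_KM = (168 + 225) / 2         # mean EV consumption, Wh/km
BATTERY_KWH = 50                    # battery capacity, kWh
CHARGES = 500                       # charge cycles per battery life
CO2_PER_BATTERY_KWH = 110           # kg CO2 per kWh of battery produced
GRID_CO2 = 0.489                    # kg CO2 per kWh generated, 2017
CONVENTIONAL_G_PER_KM = 135         # g CO2 per km, conventional car

energy_kwh = KM_PER_YEAR * WH_PER_KM / 1000          # kWh needed per year
batteries = energy_kwh / (CHARGES * BATTERY_KWH)     # batteries used up per year
battery_co2_mt = batteries * BATTERY_KWH * CO2_PER_BATTERY_KWH / 1e9
grid_co2_mt = energy_kwh * GRID_CO2 / 1e9
conventional_mt = KM_PER_YEAR * CONVENTIONAL_G_PER_KM / 1e12

print(f"energy:       {energy_kwh / 1e9:.3f} TWh")
print(f"batteries:    {batteries:,.0f} per year")
print(f"EV total:     {battery_co2_mt + grid_co2_mt:.1f} Mt CO2")
print(f"conventional: {conventional_mt:.2f} Mt CO2")
```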

< to be continued or at least edited >

Sentinel-5 Precursor

Published: 2018-09-25; Last edited: 2018-10-01

On September 24th 2018 I attended a scientific colloquium at DLR Oberpfaffenhofen. The topic, presented by Dr. Diego Loyola of the EOC, was the new satellite Sentinel-5 Precursor with its new instrument, the spectrometer Tropomi. Here is the official ESA website of Sentinel-5 Precursor and here a news article by DLR (in German). Sentinel-5 Precursor is part of the Copernicus program.

It was quite fascinating seeing the advancement in technology. The difference from older instruments like OMI, GOMOS and SCIAMACHY is huge; a factor of 100 in spatial resolution was mentioned. An impressive comparison of the pixel sizes can be found here (source). The accuracy should be 10 to 15 times better (source is the German podcast Auf Distanz).

One example even showed how certain measurements (I think it was of SO2) took one year with one of the older instruments and just one day with the new Tropomi, with better spatial resolution on top. Incredible!

Tropomi primarily does column retrievals of ozone (O3), nitrogen dioxide (NO2), sulfur dioxide (SO2), formaldehyde (HCHO), aerosols, clouds, carbon monoxide (CO) and methane (CH4). The podcast mentioned above also mentions water vapor (H2O).

The data of Tropomi is public, though I have not tried to download it yet. It seems you just have to register here and then you are free to go.

The future will bring another satellite, the Sentinel-4, with similar performance but with a geostationary orbit above Europe. It will deliver an hourly update of the condition of the atmosphere.

http://seom.esa.int/atmos2015/files/presentation79.pdf

I got another chance to see the talk. This time at the LMU and its Institute for Meteorology. The talk took place on February 5th 2019.

Motivation: strategic plan for US climate change science; earth and urban systems

Correlation between humans and CO2

SO2 could be reduced quickly -> motivation for CO2

14 kg of air vs. 1.5 kg of food and drink (per person per day)

7 million people per year die from air pollution

IPCC 2014 radiative forcing diagram: components of radiative forcing

GOME -> SCIAMACHY -> GOME-2 (MetOp-A) -> GOME-2 (MetOp-B) -> Sentinel-5P (Tropomi, the best instruments worldwide) -> GOME-2 (MetOp-C) -> Sentinel-4 -> Sentinel-5

Courtesy of (name garbled in my notes): nice image with the spectra of the different instruments

UV/VIS/NIR/SWIR, resolution 3.5 x 7.5 km

Sentinel: volcanic eruption of Mount Agung, 2017-11-27; comparison of Tropomi and GOME

Machine learning for merging multi-decadal data

The operational cloud retrieval algorithms: machine learning paper

A new health check of the ozone layer at global and regional scales: trends

Increase of CFC-11 from eastern Asia -> emissions from production (Weng et al., Nature, 2017); chart of where pollution is produced vs. where goods are used, very interesting!

SO2 reduction worldwide -> forest dieback; China 2005 -> 2016

CO natural, HCHO from trees -> everywhere there is primeval forest; summer 2018 ->

How is it possible, looking down, to separate tropospheric and total ozone?

KNMI as image source (Dutch institute); NO2 from the fires in California as strong as in a city center

Methane in the northern part of Africa (from the equator up). Why?

SO2 in Dobson units? 10^-1 to 10^2? What is DU? Not Dobson unit? Probably it is!

Machine learning: full-physics inverse learning machine (FP_ILM) -> deterministic or ...

Aerosol with polarization

O3: CAMS/ECMWF O3 assimilation?

FP_ILM effective scene albedo

Sentinel-4 (GEO) -> GEMS 2020 (GEO, Asia) -> TEMPO 2022 (GEO, USA)

Forecasts as part of the normal weather forecast

< to be continued or at least edited >

Tires as a source of microparticles

Published: 2018-09-09; Last edited: 2018-09-10

Some time ago I had a lively discussion about microparticles emitted by vehicles. The focus was on cars and bikes.

The devil's advocate, my discussion partner, claimed that the emission of microparticles by a bike is probably as high as that of a car (as I remember, he meant the microparticles in the exhaust gas).

This discussion has been on my mind ever since. Today I read some articles and studies on the topic. They focus on the emission of particles by tires, but it's a good start to get an overview.

This Spiegel Online article (in German) led me to this study by the Fraunhofer institute UMSICHT (in German). There the sources of microparticles (plastics) are listed in quite some detail. Unfortunately only the mass per year and person is listed, not the mass per kilometer. To get a better overview I looked for statistics about the distance traveled by bikes and cars per year. Here you can find information about cars and here about bikes.

Another interesting article can be found in the Süddeutsche Zeitung from September 8th 2018, or on their website behind a paywall (in German).

Bluetooth on Linux

Published: 2018-08-26; Last edited: 2018-09-09

One of my current projects is getting my tablet, a Lenovo Yoga 2 8" 851F, running as a small Arch Linux machine. There are just two major problems concerning hardware and software support: one is that the onboard sound card unfortunately does not work yet (it's a WM????); the other is that there is just one USB port (Micro USB).

After finding this (?????????) little tutorial about installing Linux on the Yoga tablet I was able to get Bluetooth running. So with Bluetooth it's possible to transform the little 8" tablet into a nice little computer.

After quite an extensive search I found three mandatory accessories as bluetooth devices.

Ozone and CO2

Published: 2018-07-25; Last edited: 2018-10-10

I was curious about the amount of CO2 that gets emitted by humans and how it compares to the, at first sight unbelievably small, ozone layer of the earth (Dobson unit). So here come some small calculations:

The surface of the earth is roughly 510 075 000 km². According to this Wikipedia page the anthropogenic emission of CO2 (equivalent) is around 33.5 Gt a year. The density of CO2 is around 1.977 kg/m³.

\frac{\dot{m}_{CO_{2}}}{\rho_{CO_{2}}A_{Earth}}=\frac{33.5\cdot 10^{12}~\mathrm{kg/a}}{1.977~\mathrm{kg/m^3}\cdot 510\,075\,000\cdot 10^{6}~\mathrm{m^2}}=0.033~\mathrm{m/a}=3.3~\mathrm{cm/a}

That means every year we dump a 3.3 cm layer of CO2 (equivalent) onto the earth's surface. The absolute level of CO2 (pure) in the atmosphere is around 440 ppm or 3200 Gt. That means, with the calculation above, 3.17 m!

\frac{m_{CO_{2}}}{\rho_{CO_{2}}A_{Earth}}=\frac{3200\cdot 10^{12}~\mathrm{kg}}{1.977~\mathrm{kg/m^3}\cdot 510\,075\,000\cdot 10^{6}~\mathrm{m^2}}=3.17~\mathrm{m}
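As a sketch, both layer heights follow from the same three constants used above (variable names are mine):

```python
A_EARTH = 510_075_000 * 1e6    # earth's surface area in m^2
RHO_CO2 = 1.977                # density of CO2 in kg/m^3

annual = 33.5e12 / (RHO_CO2 * A_EARTH)    # m/a: layer added per year
total = 3200e12 / (RHO_CO2 * A_EARTH)     # m: layer of all atmospheric CO2

print(f"{annual * 100:.1f} cm/a, {total:.2f} m")
```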

The same idea can be applied to the plastics dumped into the oceans (see this link) and probably other stuff.

As a starting point for researching information about climate change this link seems to be a good resource!

First entry

Published: 2018-07-18; Last edited: 2018-07-18

This is my new blog!