# Aftermath of the solar eclipse

Posted 7 years ago | 25963 Views | 14 Replies | 55 Total Likes
Today one of the rare solar eclipses has taken place. In this community there were fantastic posts preparing us for the eclipse; Vitaliy Kaurov showed some nice image processing of photos of the eclipse. Solar eclipses have always fascinated people: in ancient times they were often feared. They were widely regarded as a bad omen, and sometimes they actually were, for example for two astronomers who were beheaded on the order of Emperor Zhong Kang because they had failed to predict a solar eclipse.

Even before today's eclipse there were reports warning that Europe might face large-scale blackouts, because the power grids would be strained by a lack of solar power. This is why I decided to use Mathematica to analyse some data about the effects on the power grid in the UK. I also use data from a Netatmo weather station to analyse variations in the temperature in Europe due to the eclipse.

Let's start with the temperature data. Aberdeen (UK), where I live, is quite close to the region of the total solar eclipse, so we can hope to see some effect. First, we need to find out when the solar eclipse took place in Aberdeen. I download the respective data from a website:

```mathematica
Import["http://www.timeanddate.com/eclipse/in/uk/aberdeen", "Data"][[2, 2]] // TableForm
```

This gives a table of the eclipse times: the eclipse started at about 8:30am and ended at 10:46am. I will use this to mark the relevant time intervals in some of the images in this post.

Next, I need data on the temperature. I own a Netatmo weather station that measures temperature, humidity and pressure every five minutes or so. I can download that data from my account. The corresponding csv file is attached to this post (Outdoor_20_3_2015.csv).
I use Mathematica 10's new SemanticImport:

```mathematica
tempnetatmopriv = SemanticImport["~/Desktop/Outdoor_20_3_2015.csv"]
```

which gives a Dataset. We can plot the data:

```mathematica
DateListPlot[
 Lookup[tempnetatmopriv // Normal, {"column2", "column3"}][[4 ;;]],
 Filling -> Bottom,
 Epilog -> {{Red, Opacity[0.2],
    Rectangle[{DateList[{2015, 3, 20, 8, 33, 0}], 6},
     {DateList[{2015, 3, 20, 10, 46, 0}], 9.8}]}},
 AspectRatio -> 0.25, ImageSize -> Full,
 FrameLabel -> {"Time", "Temperature degrees centigrade"},
 LabelStyle -> Directive[Bold, Medium]]
```

The red box marks the time of the eclipse, and a clear drop in temperature shows the reduced solar irradiation. Of course, this could be a very localised event and is not sufficient evidence for an effect of the eclipse. I therefore use the public data feature of Netatmo, which I described in an earlier post. The main idea is that I can use an API to get data from many stations all over Europe. I used the same netatmo.sh file described in the earlier post, and wrapped it into a RunScheduledTask:

```mathematica
m = 1;
RunScheduledTask[
 Run["~/Desktop/netatmo.sh"];
 data = Import[
   "https://api.netatmo.net/api/getpublicdata?access_token=" <>
    Last[StringSplit[Import["~/Desktop/request-token.txt", "CSV"][[1, 1]], "\""]] <>
    "&lat_ne=59.91&lon_ne=13.75&lat_sw=40.42&lon_sw=-20.0&filter=True", "Text"];
 tab = Quiet[
   Select[Select[
     Table[ToExpression /@
       Flatten[StringSplit[#, "]"] & /@ StringSplit[#, "["] & /@
         If[Length[StringSplit[StringSplit[data, "place"][[k]], ","]] > 12,
          Drop[StringSplit[StringSplit[data, "place"][[k]], ","], {5}],
          StringSplit[StringSplit[data, "place"][[k]], ","]]][[{2, 3, 7, 8, 15}]],
      {k, 2, Length[StringSplit[data, "place"]]}],
     Length[Cases[Flatten[#], $Failed]] == 0 &], Length[#] == 5 &]];
 Export["~/Desktop/Databin/DataTemp" <> ToString[1000 + m] <> ".csv",
  {DateString[], tab}];
 m++, 900]
```

This function already cleans the data a wee bit and exports it to a data folder. I did this locally, but of course the new Wolfram Databin is a very useful feature here, too.
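As a rough sketch of that Databin alternative: instead of exporting each batch to a local CSV file, the scheduled task could push each snapshot into a Databin. The bin and field names here are illustrative, and this assumes you are signed in to the Wolfram Cloud.

```mathematica
(* Sketch only: push each snapshot of cleaned station data into a
   Wolfram Databin instead of a local CSV file. 'tab' is the cleaned
   station table produced by the scheduled task above. *)
bin = CreateDatabin[];                          (* create a new bin once *)
DatabinAdd[bin, <|"Time" -> Now, "Stations" -> tab|>];
Dataset[bin]                                    (* inspect the accumulated entries *)
```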
Anyway, I can now import all the data:

```mathematica
Monitor[netatmoeurope =
   Table[Import["~/Desktop/Databin/DataTemp" <> ToString[1000 + m] <> ".csv"],
    {m, 1, 91, 2}];, m]
```

and plot it into individual frames:

```mathematica
Monitor[For[m = 1, m <= 46, m++,
  datanetatmo = {netatmoeurope[[m, 1]], ToExpression@netatmoeurope[[m, 2]]};
  Export["~/Desktop/SolarEclipseFrames/frame" <> ToString[1000 + m] <> ".gif",
   GeoRegionValuePlot[
    GeoPosition[{#[[2]], #[[1]]}] -> #[[3]] & /@ datanetatmo[[2]],
    PlotRange -> {-7, 30}, ColorFunction -> "TemperatureMap", ImageSize -> Full,
    Epilog -> Inset["Time: " <> datanetatmo[[1]], {-0.3, -0.3},
      BaseStyle -> Directive[Large, Bold]]]]], m]
```

I am on a Mac and use the command

```shell
convert -delay 35 -loop 0 frame*.gif animated.gif
```

to generate the following animation. It is a bit difficult to see, but around frame 20 the temperature drops a bit, particularly in the Nordic countries. This is actually easier to see when we look at a differenced time series. The following plots represent the temperature change at several locations in Europe:

```mathematica
Monitor[netatmoeuropedata =
   Table[Import["~/Desktop/Databin/DataTemp" <> ToString[1000 + m] <> ".csv"],
    {m, 1, 91, 2}];, m]
```

and then

```mathematica
ArrayReshape[
  Table[Differences@(ToExpression@#[[2, n]] & /@
       Delete[netatmoeuropedata, 14])[[All, 3]] // ListLinePlot,
   {n, 1, 9}], {3, 3}] // TableForm
```

It is not difficult to see that there is a general drop of the temperature around frame 20.
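As an aside, the ImageMagick step is not strictly necessary: Mathematica can export an animated GIF directly. A sketch, assuming the plots are collected in a list `frames` (a name I introduce here; the loop above exports files instead):

```mathematica
(* Sketch: export a list of plot frames as an animated GIF directly,
   with ~0.35 s per frame, looping forever. 'frames' is assumed to hold
   the GeoRegionValuePlot graphics generated in the loop above. *)
Export["~/Desktop/animated.gif", frames,
 "DisplayDurations" -> 0.35, "AnimationRepetitions" -> Infinity]
```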
In fact, I can average over all stations:

```mathematica
ListLinePlot[
 Mean /@ Transpose[
   Table[Differences@(ToExpression@#[[2, n]] & /@
       Delete[netatmoeuropedata, 14])[[All, 3]], {n, 1, 1200}]],
 Frame -> True, FrameLabel -> {"Frame", "Temperature Difference"},
 Epilog -> {{Red, Opacity[0.2], Rectangle[{18, -0.9}, {21, 1}]}}]
```

and I see again a marked drop in (the increase of) the temperature during the eclipse. This is more difficult to see than in the case of my own Netatmo station, because (i) my station measures every 5 minutes whereas the public data is only updated every 30 minutes, and (ii) I live in a place where the eclipse reached about 94% coverage, whereas the average over all European stations includes countries with much lower percentages.

So there remains the question of the power grid. It turns out that in the UK there is a portal which gives access to a wealth of data from the power grid: http://www.gridwatch.templar.co.uk. You can download data in csv format (see "gridwatch-4.csv" attached to this post). The data is measured in 5-minute intervals. SemanticImport does the trick:

```mathematica
data = SemanticImport["~/Desktop/gridwatch-4.csv"]
```

Now we can look at the data. The frequency of the AC should optimally be at around 50Hz. If the system is strained, the frequency drops. The system is carefully controlled to maintain that frequency, so let's see whether we can find evidence for control. We can plot the frequency during the solar eclipse:

```mathematica
DateListPlot[Lookup[data // Normal, {"timestamp", "frequency"}],
 Epilog -> {{Red, Opacity[0.2],
    Rectangle[{DateList[{2015, 3, 20, 8, 33, 0}], 49.87},
     {DateList[{2015, 3, 20, 10, 46, 0}], 50.22}]}},
 AspectRatio -> 0.25, FrameLabel -> {"Time", "Frequency Hz"},
 LabelStyle -> Directive[Bold, Medium], Filling -> Bottom]
```

So, nothing much there. The frequency was quite constant. Obviously, the system was quite under control in spite of the photovoltaic power "cut".
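To put a number on how tightly the frequency was controlled, one could also look at its spread around 50 Hz; a quick sketch, using the column name from the import above:

```mathematica
(* Sketch: summary statistics of the grid frequency. A small standard
   deviation around 50 Hz would indicate tight control. *)
freq = N@Lookup[data // Normal, "frequency"];
{Mean[freq], StandardDeviation[freq], Min[freq], Max[freq]}
```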
This is interesting, because I heard that there is a job at a national grid company here where someone has to watch the popular programme "EastEnders": when the programme finishes, people apparently "put the kettle on", i.e. make tea, and that increases the power consumption. That person then has to add energy from other countries, e.g. France, and/or hydroelectric power from Scotland to keep the network stable. Hydroelectric power is very convenient here, because it is easy to "switch on and off". This is different from other sources like nuclear power plants and wind power. So let's have a look at the hydroelectric power production:

```mathematica
DateListPlot[Lookup[data // Normal, {"timestamp", "hydro"}],
 Epilog -> {{Red, Opacity[0.2],
    Rectangle[{DateList[{2015, 3, 20, 8, 33, 0}], 0},
     {DateList[{2015, 3, 20, 10, 46, 0}], 850}]}},
 AspectRatio -> 0.25, FrameLabel -> {"Time", "Energy MW"},
 LabelStyle -> Directive[Bold, Medium], Filling -> Bottom]
```

Hmm, that looks as if somebody switched something on here. It might perhaps be a sign of active control. I could not find any evidence for a change in the other energy sources such as wind and nuclear. The power demand went up a bit during that time:

```mathematica
DateListPlot[Lookup[data // Normal, {"timestamp", "demand"}],
 Epilog -> {{Red, Opacity[0.2],
    Rectangle[{DateList[{2015, 3, 20, 8, 33, 0}], 29000},
     {DateList[{2015, 3, 20, 10, 46, 0}], 50000}]}},
 AspectRatio -> 0.25, FrameLabel -> {"Time", "Energy MW"},
 LabelStyle -> Directive[Bold, Medium], Filling -> Bottom]
```

I am not quite sure why that might be, but perhaps lots of us "put the kettle on" during and after the event!? All these results are of course quite unscientific, but there is no evidence that our power grid was at risk of a major blackout. But then the insolation in Scotland is quite low, and clouds are not unheard of either. Perhaps the reliance on solar power is not large enough to impact the system substantially.
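One could also check, as a rough sketch, whether the hydro output tracked demand over the day by correlating the two columns. This is only suggestive, since both series share a strong daily trend:

```mathematica
(* Sketch: correlation between demand and hydro output. A high value
   would be consistent with hydro being ramped to follow demand, though
   the shared daily cycle inflates it. *)
demand = N@Lookup[data // Normal, "demand"];
hydro  = N@Lookup[data // Normal, "hydro"];
Correlation[demand, hydro]
```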
There are many more things to try out; we have only used a tiny amount of the available data.

All the best from Aberdeen,

Marco

PS: The Netatmo data can be downloaded as a zip file here.

Attachments:

14 Replies

Posted 7 years ago

I mention all of you and your ideas in this blog: Solar Eclipses from Past to Future, Earth to Jupiter. Thanks for wonderful contributions and fun!

Posted 7 years ago

This related discussion seems very interesting to me: Solar eclipses on other planets

Posted 7 years ago

Dear Henrik, that looks really great. I was aware that my method was quite flawed: I only cover the crescent with a disk, which is by no means ideal. I hoped to be able to use a bit of maths to correct for that. Your Manipulate GUI is really nice! Thanks for sharing.

Best wishes,
M.

Posted 7 years ago

Hi Vitaliy, I have just seen that it won't be long before MMA10 arrives. Cannot wait to try stuff out. I have recently programmed a little miner for Twitter. It would be nice to see how WordCloud does in combination with that.

Cheers,
M.

Posted 7 years ago

To highlight the diversity of the discussion we can use the new Wolfram Language function WordCloud:

```mathematica
post = Import["http://community.wolfram.com/groups/-/m/t/463721"];
Row[{
  WordCloud[ToUpperCase[DeleteStopwords[post]], ColorFunction -> "RustTones",
   Background -> Black, ScalingFunctions -> (#^.1 &)],
  WordCloud[ToUpperCase[DeleteStopwords[post]], ColorFunction -> "RedBlueTones",
   ScalingFunctions -> (#^.1 &), Background -> ColorData["RedBlueTones"][0],
   WordOrientation -> {{0, \[Pi]/2}}]}]
```

In the code above there are three handy tricks. First, the option WordOrientation has diverse settings for the words' directions. Second, a good choice of ScalingFunctions can grant good visual appeal, and the simple power law I've chosen is often more flexible than a logarithmic one. The third trick is subtler: it is the choice of the background color to be the "bottom" color of the ColorFunction used.
Then not only do the sizes of the words stress their weights, but also their fading out into the background.

Posted 7 years ago

Thanks Dr. Kaurov, I took the photos on an iPad; there's no processing done on the pictures.

Posted 7 years ago

Hi Marco, inspired by your and Sander's posts on how much of the sun was covered during the last eclipse, I also tried to write an evaluation from picture data. For this I use Tanvi's first picture (@Tanvi: thanks for posting!), because in the second one the sun seems to be "cut" a bit at the edges by clouds. My ansatz is to use a model of the eclipse consisting of two equal-sized disks overlapping each other. This simple model has five parameters (the number on the right side represents a fitness of match). One can define a "fitness function" to quantify the goodness of match and try an optimization for this. Unfortunately I was not able to get that working perfectly: the result of NMaximize is not bad, but by no means perfect. This seems strange, because it is very easy to improve the solution by hand. Once a result is obtained, one can very easily calculate the covering:

```mathematica
\[ScriptCapitalR] = RegionDifference[Disk[{x, y}, r],
    Disk[{x + dr Cos[\[Phi]], y + dr Sin[\[Phi]]}, r]] /. opt;
Print["covering: ", 100 (1 - RegionMeasure[\[ScriptCapitalR]]/(Pi r^2) /. opt), "%"]
```

Otherwise I would not have a quick idea on how to do that, probably some ugly integrals ... I have attached the notebook with the code. The next solar eclipse may come!

Henrik

Attachments:

Posted 7 years ago

Tanvi, thank you for posting! Did you or someone you know take these photos? If yes, do you happen to know with what camera and whether the photos were post-processed?

Posted 7 years ago

Hi Vitaliy, I guess it might become important for the global warming discussion one day. The good thing is that there is very good data coverage in some areas.
I think two of the main problems are: (i) there is basically no data in the Arctic and Antarctic, and these regions are very important for climate models. Of course you can use Mathematica to do e.g. Kriging to estimate the temperatures in these remote regions, or use additional satellite data. (ii) Climate change is a rather long-term process. It is not weather change, but is supposed to take place over long periods of time. So these private weather stations would have to be operational for many years, much longer than the product has existed. In principle it should be possible to monitor (data mine with the help of DataDrop?) some extreme weather events, e.g. pressure profiles during a hurricane etc.

There is something else that might be interesting. It is way off the original topic of the post, but it is meant as a side remark, so I'll just post it here. The people from Netatmo used the indoor sound measurements, which are not publicly available, to analyse the sound levels during the football/soccer world championship to find out which fans are most enthusiastic. This was actually a quite nice addition to the fantastic blog posts (first / second) on the Wolfram blog. All of these studies use data from a crowd-sourced database. For my own data at home I can do something similar:

```mathematica
data = SemanticImport["~/Desktop/Indoor_25_3_2015.csv"];
```

If you now plot the noise levels (column6) you get

```mathematica
DateListPlot[
 Transpose[{Lookup[data // Normal, "column2"][[23 ;;]],
   MovingAverage[Lookup[data // Normal, "column6"][[4 ;;]], 20] // N}],
 FrameLabel -> {"Time", "Noise level dB"},
 LabelStyle -> Directive[Bold, Medium]]
```

This is the noise level at my home. You can see that something changed at the end of August last year, which was when my daughter was born. Before Christmas the noise levels went up, which agrees with her getting unsettled in that period; they went down again when we travelled over Christmas.
After the Christmas holidays we came back and then left again for another week or so, as you can clearly see from the data. It is actually quite interesting to see that there is a pattern developing towards the end of the time series:

```mathematica
DateListPlot[
 Transpose[{Lookup[data // Normal, "column2"][[23 ;;]],
    MovingAverage[Lookup[data // Normal, "column6"][[4 ;;]], 20] // N}][[-500 ;;]],
 FrameLabel -> {"Time", "Noise level dB"},
 LabelStyle -> Directive[Bold, Medium], AspectRatio -> 1/3, Filling -> Bottom]
```

The high-frequency component corresponds to days, i.e. there is a daily cycle of the noise levels. Interestingly, there is also a longer 10-11 day cycle. I have absolutely no idea where that comes from. Any hints are welcome!

Cheers,
Marco

Posted 7 years ago

Hi Tanvi and Vitaliy, I know that this does not work quite well yet, but if I take the photo that Tanvi posted

```mathematica
img = Import["~/Desktop/Eclipse.jpg"]
```

and crop it a bit:

```mathematica
img2 = ImageTrim[img, {{447.386, 409.278}, {513.381, 344.818}}]
```

The question, which also relates to Sander's question in Vitaliy's post, is: how much of the sun was covered? The idea is to cover the crescent with a disk:

```mathematica
MorphologicalComponents[Binarize[img2, 0.4], Method -> "BoundingDisk"] // Colorize
```

then determine the shape of the crescent itself:

```mathematica
MorphologicalComponents[Binarize[img2, 0.4]] // Colorize
```

and then calculate the respective areas:

```mathematica
full = ComponentMeasurements[
    MorphologicalComponents[Binarize[img2, 0.4], Method -> "BoundingDisk"],
    "Area"][[1, 2]]
(* 1206.5 *)

crescent = ComponentMeasurements[
    MorphologicalComponents[Binarize[img2, 0.4]], "Area"][[1, 2]]
(* 311.75 *)
```

The covering is then `1 - crescent/full`, which gives about $74\%$. This number is only a rough estimate.
The fitted disk is not quite right, as you can see in this plot:

```mathematica
ImageMultiply[img2,
 ColorNegate@(MorphologicalPerimeter[
     MorphologicalComponents[Binarize[img2, 0.4], Method -> "BoundingDisk"],
     0.2] // Colorize)]
```

The black circle is the estimate of the shape of the full sun, which is obviously not quite what we want.

Cheers,
M.

PS: Tanvi, thanks a lot for posting the photos!
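A small aside on the two-disk coverage models discussed in this thread: if one assumes two equal disks of radius $r$ whose centres are a distance $d$ apart (with $d \le 2r$), the overlap (lens) area has a closed form, so no ugly integrals are needed:

```latex
A_{\text{lens}} = 2r^{2}\cos^{-1}\!\left(\frac{d}{2r}\right)
                - \frac{d}{2}\sqrt{4r^{2}-d^{2}},
\qquad
\text{covering} = \frac{A_{\text{lens}}}{\pi r^{2}} .
```

As a sanity check, $d = 0$ gives $A_{\text{lens}} = \pi r^{2}$ (full coverage) and $d = 2r$ gives $A_{\text{lens}} = 0$ (no coverage), as expected.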
Posted 7 years ago
Thank you for a very interesting analysis, Professor Thiel. Here are a few pictures of the eclipse from Aberdeen, taken around 9:25 am.
Posted 7 years ago
Marco, thank you for such an extended reply! Do you think the types of data you were mining here have significance for the global warming discussion? Have you or your team ever done research in that direction?
Hi Vitaliy, nice to hear from you. Yes, I read your posts with great interest. The graphic you posted is really nice. It is very interesting to see the area of the different counties and how that depends on the GeoPosition, i.e. the further east, the smaller. I found this instructive:

```mathematica
AbsoluteTiming[
 countydata = {CommonName[#],
     EntityValue[#, EntityProperty["AdministrativeDivision", "Area"]],
     EntityValue[#, EntityProperty["AdministrativeDivision", "Position"]],
     EntityValue[#, EntityProperty["AdministrativeDivision", "Population"]],
     EntityValue[#, EntityProperty["AdministrativeDivision", "PopulationDensity"]]} & /@
   usco;]
```

to get some data on the counties, and then this BubbleChart:

```mathematica
BubbleChart[{Last@Flatten[List @@ #[[3]]], Log@(QuantityMagnitude[#[[2]]]),
   QuantityMagnitude[#[[4]]]} & /@ countydata,
 Epilog -> {Line[{{-104, 3.6}, {-104, 9.7}}], Line[{{-128, 7}, {-75, 7}}]},
 FrameLabel -> {"Longitude, i.e. West to East", "Log[Area of County]"},
 LabelStyle -> Directive[Bold, Medium]]
```

The x-axis is the position of the counties from west to east. The y-axis shows the area of the counties, and the size of the bubble represents the population of the county. The black lines are to guide the eye. The graph shows the large western counties and the small eastern ones (in terms of surface area). If we look at the counties east of -104 degrees longitude, we see that there are

```mathematica
Total@(QuantityMagnitude@
    Select[countydata, Last@Flatten[List @@ #[[3]]] > -104 &][[All, 4]])
(* 14755803 *)
```

nearly 15 million people. West of -104 degrees, there are only

```mathematica
Total@(QuantityMagnitude@
    Select[countydata, Last@Flatten[List @@ #[[3]]] < -104 &][[All, 4]])
(* 2228140 *)
```

2.2 million people. So if the pleasure of experiencing such an event is roughly proportional to the number of people, there will be much more pleasure in the east. In other words, the joy will increase over time... Sorry for this digression.

Yes, Aberdeen was quite close to the zone of totality on 20 March.
Unfortunately, the sky was not as clear as the data from the station at the airport (a bit outside of Aberdeen) suggests. I was teaching/preparing to teach at the time, but here are some pictures that a colleague of mine took. Even image processing cannot make those pictures clear. I have, however, seen some photos that students took that showed more; I will ask them to post some of them.

You also said that you hope that people in the US will have Netatmo devices by then. In fact, there are lots of them everywhere in the US, for example around Champaign IL. This is the cool thing about these crowd-sourced devices: they are everywhere, and the data can be quite cool. There is also a project in Japan where they use Geiger counters everywhere to track radioactivity levels.

It appears that one of the advantages of Mathematica is that, because of its ability to import all sorts of data, it can help to combine different publicly available data sources, even if the data comes in very different forms. This is a nice step towards Conrad Wolfram's idea of How Citizen Computing Changes Democracy. Data is everywhere, and it's fun to work with it. With all the connected devices and technologies such as DataDrop, the amount of data that will be collected in the future will be enormous. I am sure that in 2017 we will see great data collected during the eclipse. For those who are already looking forward to it, here are some more impressions from this year's eclipse.

Cheers,
Marco