Climate data volumes are growing rapidly, with providers relying on increasingly energy-hungry, less sustainable HPC facilities. Are we now in a transition toward leaner downstream products built with cloud processing?
Weather Logistics attended the ‘Data Sciences for Climate and Environment’ conference in March 2018. Organised by the Turing Gateway to Mathematics, the event featured speakers from NASA JPL and the UK Meteorological Office. High-resolution modelling is now producing valuable insights into our future climate system. However, data volumes and processing costs scale with the product of the three spatial dimensions (x, y, z) and the time domain: roughly the fourth power of resolution. My view is that we have reached a point where doubling the resolution, and with it a sixteen-fold increase in cost, is no longer justified by the commercial value it adds.
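The scaling argument above can be sketched in a few lines. This is a simplifying back-of-the-envelope model, not the exact cost function of any particular weather model; the helper `relative_cost` is hypothetical.

```python
# Halving the grid spacing doubles the number of points along each
# spatial axis (x, y, z), and the time step must typically shrink in
# proportion, doubling the number of time steps as well. Relative
# compute/storage cost therefore grows as refinement ** 4.

def relative_cost(refinement: float) -> float:
    """Relative cost of refining a model by `refinement` along
    x, y, z and time (a simplifying assumption)."""
    return refinement ** 4

print(relative_cost(2))  # doubling resolution -> 16x the cost
```

This is why each step up in resolution delivers diminishing commercial returns: the cost grows far faster than the added detail.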
Open weather data describing local daily conditions is now accessible to all. Flood-forecasting challenges can be tackled with machine learning using gridded rainfall and drought products on a sub-5 km grid. Long-term numerical weather prediction data is available from international centres, funded by the European Commission in partnership with the European Space Agency and ECMWF – the world’s leading weather centre.
The looming problem is the spiralling cost of data processing, which makes these free services very expensive to sustain. Copernicus, the European programme for environmental monitoring, will cost the public EUR 3.3bn over the six years to 2020. This cost is expected to rise further with more users, more data downloads, and as portals serve higher-resolution and longer records of satellite data.
Like a natural resource, data is becoming more costly to produce and process, consuming vast amounts of electricity; open and free services will need start-ups and users to build profitable products on top of them to justify their continuation from 2020 and beyond. Weather Logistics addresses some of these problems by running more optimised, computationally less demanding statistical models, which drastically lowers the cost of our weather data for our clients.