UK Releases 130 Terabytes of Oil and Gas Data (spe.org)
219 points by infodocket on March 25, 2019 | 97 comments



I work/consult for a seismic processing company. 130TB is a drop in the proverbial bucket. Our small shop has multiple petabytes of data in surveys/well bore logs.

If you were looking to modernize an industry, O&G is super fit for disruption. There are really only two major players, who have awful legacy software. We spent $300k last year to acquire a single seat on a piece of software. We spent another $200k on 2 seats for a year of another piece of software.

These applications are total garbage too.


> If you were looking to modernize an industry O&G is super fit for disruption.

The only way to "modernize" Oil and Gas that is compatible with a future for the planet is to shut it down.


Perhaps for the purposes of energy. There are myriad petrochem products that have no viable alternative.


No commercially viable alternative when the externalities aren't priced correctly. But hydrocarbons are already made from coal:

https://en.wikipedia.org/wiki/Fischer%E2%80%93Tropsch_proces...

The same process can be used with biochar, or potentially carbon from some future carbon capture method.


This is true and exactly why we shouldn't burn these valuable resources.


Along with all agriculture, medicine, transport, etc.

"Shut it down!" doesn't apply so well to the food supply right?


What's your point? If you cannot fix everything, fix nothing instead?


I think their point was shutting down oil and gas is de facto shutting down all of those other fields.


Almost everything you buy has an O&G component to it. With the economic development around the world, at this point it is either O&G or worse alternatives (costlier and sometimes dirtier).

How do you think goods are transported? How do you think the material is processed?

I agree that humanity needs to find better alternatives, but we all have to be realistic.


> I work/consult for a seismic processing company. 130TB is a drop in the proverbial bucket. Our small shop has multiple petabytes of data in surveys/well bore logs.

The quality of the data is important. Potentially there is a lot of value here, especially for academia who might not have access to this kind of data. You may have multiple Petabytes, but how much of that are you giving away?

I would also argue that 130TB is on the edge of what you can feasibly transfer and store without requiring some kind of complex setup. When you get into Petabytes you're really having to design a unique system just to store and access this data.
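To put "feasible to transfer" in numbers, here's a quick back-of-envelope sketch. The 80% link-efficiency factor is an assumption for protocol overhead, not a measured figure:

```python
# Back-of-envelope: how long does 130 TB take to move at common link speeds?
TB = 1e12  # decimal terabyte, in bytes

def transfer_days(size_bytes, link_gbps, efficiency=0.8):
    """Rough transfer time in days at a given line rate.
    `efficiency` is an assumed protocol/overhead factor."""
    seconds = size_bytes * 8 / (link_gbps * 1e9 * efficiency)
    return seconds / 86400

for gbps in (1, 10, 40):
    print(f"{gbps:>3} Gbit/s: {transfer_days(130 * TB, gbps):.1f} days")
```

At 1 Gbit/s that's roughly two weeks of sustained transfer, which is right at the edge of "just download it" territory; at petabyte scale you're into shipped-disks and custom storage design.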


When I worked at CGG long ago, we worked with a lot of the proprietary E&P analysis software - it was primarily old UNIX but surprisingly quite capable and scalable.

I think there's a lot of unknowns in terms of capabilities and algorithms to go after in that market.

I’d thought of going back into it and developing some front end visualization software - but the amount of secrecy and magic sauce put me off.


One contractor I worked with took their original software designed to run on IBM TSO/JES2 type mainframes and added a rough "GUI" to it. The parameterization of the modules was identical but instead of entering everything into columns and rows like a set of digital punchcards the user could simply fill out a field and the new software would insert that information into the correct row-column. Then they rewrote all of it from the top with a genuine GUI and sweet graphics and we all laughed when the only thing that worked at their initial demo to the processing groups was a band-pass filter. Minor problem for them but a real hoot for us.
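For anyone wondering why a band-pass filter is the punchline: it's about the simplest module in any processing package. As a toy illustration only (real packages filter in the frequency domain, not with boxcars), a crude band-pass can be built by differencing two moving averages:

```python
def moving_average(x, n):
    """Simple boxcar smoother; pads by repeating edge samples."""
    half = n // 2
    padded = [x[0]] * half + list(x) + [x[-1]] * half
    return [sum(padded[i:i + n]) / n for i in range(len(x))]

def crude_bandpass(trace, short=3, long=21):
    """Difference of two boxcars: the short window attenuates high
    frequencies, and subtracting the long-window average removes DC
    and low frequencies. A sketch, not a production filter design."""
    smoothed = moving_average(trace, short)
    background = moving_average(trace, long)
    return [a - b for a, b in zip(smoothed, background)]
```

A constant (zero-frequency) trace comes out as all zeros, and high-frequency jitter is damped by the short window, which is the band-pass behavior in miniature.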


What are the two largest applications? What are the applications providing? Are they doing analysis or just providing database interface to all of the survey data?


Compared to Volve this is still far better. I'm just worried it'll be all seismics, RMS projects, and .segy files. We're working on a solution for ingestion of well reports/logs with the Volve reservoir but have precious few examples from the Volve field itself. Here's to hoping this dataset is better!
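If it does turn out to be mostly .segy files, at least the container format is well documented. As a minimal sketch of where ingestion starts (byte offsets follow the SEG-Y rev 1 layout; the synthetic header values at the bottom are made up for the demo):

```python
import struct

# A SEG-Y file starts with a 3200-byte textual header, then a 400-byte
# big-endian binary header. Offsets below are relative to the start of
# the binary header (SEG-Y rev 1 byte positions 3217+, 3221+, 3225+).
def read_segy_binary_header(buf):
    bin_hdr = buf[3200:3600]
    interval_us, = struct.unpack(">h", bin_hdr[16:18])  # sample interval, microseconds
    samples, = struct.unpack(">h", bin_hdr[20:22])      # samples per trace
    fmt_code, = struct.unpack(">h", bin_hdr[24:26])     # 1 = IBM float, 5 = IEEE float
    return {"sample_interval_us": interval_us,
            "samples_per_trace": samples,
            "format_code": fmt_code}

# Synthetic demo header: 2 ms sampling, 3-second records, IEEE floats
hdr = bytearray(3600)
struct.pack_into(">h", hdr, 3200 + 16, 2000)
struct.pack_into(">h", hdr, 3200 + 20, 1500)
struct.pack_into(">h", hdr, 3200 + 24, 5)
info = read_segy_binary_header(bytes(hdr))
print(info)
```

In practice you'd reach for a library like segyio rather than hand-parsing, but the point is that the seismic volumes are the easy part; the well reports and scanned documents are where ingestion gets painful.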


> There are really only two major players who have awful legacy software. We spent $300k last year to acquire a single seat on a piece of software. We spent another $200k on 2 seats for a year of another piece of software.

What? There are many more than two major players, and every large contractor has written their own internal processing software. The best tools in their packages are reserved for internal use only, just as the majors did decades ago when they operated their own acquisition crews and had in-house processing staff.

Multi-nationals would do turkey-shoots to put new data in the hands of multiple contractors and let them have at it with their best processors and tools partly to see if processing shops had developed better tools but also to target talented processors for their own operations. It's pretty cut-throat out there.

If your company spent $300k on a single seat I would love to know what software they licensed. As an independent contractor for a couple of decades I have been able to license top software packages from top-line processing software companies for under $100k per seat. After you buy that seat you are only paying maintenance in successive years so your costs usually drop to around 20% of the cost of a new license. It covers patches and maintenance and entitles you to new versions on upgrade as long as you are current on your license. You can get a second-tier package for less than $60k + 20% annual maintenance. Some brand new packages I evaluated in the last 5 years debuted below $40k for a package that was full-featured and ready to go from field tapes to final deliverables. Your people must not do any evaluation at all.
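For anyone comparing quotes, the license-plus-maintenance model described above is easy to put in numbers. A sketch; the prices and the 20% maintenance rate are just the figures quoted in this comment, not vendor list prices:

```python
def total_cost(license_price, maintenance_rate=0.20, years=5):
    """First-year license purchase plus annual maintenance
    (a fraction of the license price) in each following year."""
    return license_price + license_price * maintenance_rate * (years - 1)

# Five-year cost of ownership at the price points mentioned above
for price in (40_000, 60_000, 100_000):
    print(f"${price:,} seat over 5 yrs: ${total_cost(price):,.0f}")
```

Even a top-line $100k seat works out to $180k over five years under this model, well under the $300k single-seat figure quoted upthread.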

The seismic software field is constantly changing.

>These applications are total garbage too.

Haha. I have seen a lot of this in my time in the industry. One thing that chaps most of our asses is that the larger software companies are tuned to the needs of those who hold the most software licenses so small shops are frequently ignored if they request new features, bug fixes, etc. in favor of the software provider adding some new whiz-bang feature for a large license holder.

A lot of the software packages available share the same roots. Several packages that I have personally evaluated are derived from one single public code base with the only real difference being their GUI. One may be more user-friendly, another sucks to have to deal with it but it has all the tools plus some custom gizmorithms, a third is almost a clone of the first but leaves out modules most useful for land or marine data and doesn't allow VSP processing.

I know they are the same code base because I cornered the developers when I noticed errors common to all of them. I had documented a persistent bug (hey, now it's a feature!) in their software while also evaluating other provider's software and in the process found two other packages that would produce the exact same output every time given identical input even though the output was clearly wrong. In one of the packages, even the parameterization screen dialogs were nearly identical. Pretty unimaginative GUI coders for some of this stuff but it is likely because the guys slapping the interface on this kludge don't actually understand the objectives behind what we are doing, they don't understand which bits of information make or break the imaging effort, etc. because they are coders and not geophysicists.

But, overall, large contractors employ some of the brightest minds in geophysics, computer science, physics, mathematics, geology, etc. That is the main reason that we are able to squeeze old data to extract even more geology than we initially could when it was acquired. Algorithms have improved, hardware is up to the task of keeping everything straight as it gets hammered through the flows.


For seismic work as far as I can tell almost everyone is using just Petrel and Paradigm as their core day to day tools. H&R has some stuff but really if you look at people working in the field you'll see Petrel experience is the most common trait.

Yes every shop has internal tools but again much of this is stuff cobbled together by non-software developers. I remember at one point a major tool at this one shop stopped working. It turns out they had hard coded a network share as a temp folder and the folder got removed.

Our $300k spend was on a completely loaded-out license from a company that rhymes with lumberge. I don't want to name them since there was a pretty strict NDA involved.


Thanks for the additional information. I believe you were probably listing prices for interpretation software packages like Petrel. Processing software is an entirely different animal and that is what I was talking about in my reply. You have to process the data so that it can be interpreted.

Your big packages for interpretation are Petrel, Paradigm's Epos systems, and IHS Kingdom-SMT. They are all great packages but as you noted there are things about each one that the user will stumble into which end up making no sense and likely result from poorly coded features which should have been upgrades but ended up being kludges.

> rhymes with lumberge

Scumbagger? As a former employee I can tell you that the best day of my life outside of my marriage and the births of my kids was the day I opened my mail to find an offer letter from another company - which I promptly accepted. Scumbagger bought a great processing software provider a few years ago. Their software was full-featured and very user-friendly. It had some quirks and a lot of kludges but their software support was top-notch the best I have seen in the industry. After the buyout, the older support hands were laid off and the support was bureaucratized to the point where it was no longer worth it to report a bug or request support. A true shame. Once a company gets that large they become like that old saying about juggies on the field crews - if they can't fuck it up they shit on it.


Seconding the 'bright minds' point about contractors. I was a physics-trained coder at a US top-5 university, contracted to do 3D modelling of sound-based exploratory data with accurately modeled earth materials in the software. So the strata of the earth materials for a real place on the one side, and the behavior of the sensors on the other. This was in the 2007 era.


I hope this position was a great stepping stone in your career and helped you reach your goals. Complex imaging problems require the skills of so many people trained in different disciplines.


It would take a ton of cash to disrupt O&G. Hundreds of millions of $. In a notoriously volatile market that may or may not be dying / declining.


You're not completely wrong, but O&G doesn't really hire all-star software developers. Most of the software is complete crudware with ASP/dated MSSQL back ends and just buckets of bad and legacy code.

I've also seen a lot of geo/petro physicists/processors intentionally decline to help because they know heavy automation endangers their jobs.


Is this really the case? Can you give us some use cases where software could increase productivity in O&G that aren't already covered?

One could argue that existing software, even if legacy, works and gets the job done, so what is there to (significantly) gain from writing new (and improved) software?


I think the biggest opportunity in this industry is automation of the survey process (as a whole). It currently requires big expensive ships with big expensive sensors with lots of manual data processing from people getting reasonable wages because they have to stare at a computer screen in some fairly nauseating conditions. The big ships tend to send companies broke fairly quickly when there is a downturn in the market (i.e. no exploration). An autonomous survey vehicle has the potential to massively reduce the survey costs (as long as it was reliable).


> lots of manual data processing from people getting reasonable wages because they have to stare at a computer screen in some fairly nauseating conditions.

Most positions I've seen do not look like they pay that well. You can make more as a processor in a shop on land and work a regular 8-5 job. On the boats you get a 12-hour tour for the duration and the only real perk that might make it worthwhile is the opportunity to visit foreign ports and dawdle during breaks.

Survey automation is a complex task, and the constraints differ depending on whether you are a marine crew or a land crew. Some things are easier in marine work due to fewer cultural constraints (buildings, highways, pipelines, etc.), but in the same way it is easier on land to locate and replace any sensor that fails without losing much data from that receiver location. For best imaging you need to be able to avoid introducing holes in your data coverage and correct anything that causes a data loss. Redundancy is a real thing out there.

The industry has morphed into one where many larger acquisition contractors have divested themselves of the ships needed to acquire the surveys, and they contract that out now to custom acquisition crews. Everything went bare-bones and rawhide in the last downturn and, as we know, seismic exploration is one of the last things to recover after a bust.


From what I've seen, all the survey shops are just bleeding money because A. survey equipment has a huge and expensive monthly cost, and B. most supermajors aren't ordering new surveys like they used to, instead choosing to have old data reprocessed.


Acquisition is always the last thing to recover after a bust. There is so much legacy data around for reprocessing that all they need do is find someone with data in their prospective area and have it reprocessed using the latest imaging tools. A lot cheaper than acquiring new data.

Also, there is a shift in the industry from ownership of the survey equipment (sensors, recording systems, etc.) to rental of everything. Manufacturers build it all, rent it out for custom surveys, maintain it and service it all, train the equipment operators, etc. That cuts costs and makes acquisition a matter of retaining trained personnel for key positions and recruiting trainable people for the rest.


Most of the advancement that leads to actual profit increase is in analytics right now. Getting more out of data before investing heavily.


Labor is one thing but I'm really talking about how expensive it would be to get access to dozens of live rigs and tons of data.


At those prices, how big is that total market? Tens of millions?


A number I saw a few years ago is that just the plugin market for Petrel is estimated at 4.5 billion USD annually.

Unfortunately I have no way to verify the source.


I'm totally in favour of open data, of course, but there are ethical issues when the data is explicitly intended to promote "exploration activity on the UK Continental Shelf, ultimately boosting recovery".

Potentially increased extraction of fossil fuels is incompatible with the UK's climate obligations and not something that should be celebrated.


Don't write off gas yet. Switching from coal to natural gas allowed the UK to cut CO2 emissions dramatically in a short period of time and has resulted in remarkably less carbon-intensive electricity generation compared to countries that have spent massive amounts on wind and solar, like Denmark and Germany. Also, the potential for carbon capture and storage for gas power generation is good and could see it remain a power source even in a net-zero CO2 emission future.


The UK has also invested heavily in wind and solar. In fact, the UK is the world leader in offshore wind energy!

Much of the UK's reduction in grid carbon emissions since around 2010 is due to a coal -> renewables shift, rather than just coal -> gas. In fact, even gas-fired electricity production has begun to decline in the UK as more wind capacity comes online.

Total low-carbon (renewables + nuclear) production reached 56% share in 2018.


Unfortunately people are a bit misled by the headlines on this. You'll get news saying that on some days renewables became a significant part of the mix (for that day), or that renewables are a significant portion of new capacity (no new gas is installed because the gas generating capacity already exceeds max potential consumption), or that renewables are now a significant portion of all installed capacity (installed, but not necessarily generating or being used). The reality is that of what is actually generated and consumed, it's mostly gas. Have a look at this regularly, https://www.electricitymap.org/?page=country&solar=false&rem... and you'll see that right now, as I'm typing, we have 15% nuclear, 10% wind and 49% gas.

Around Europe we can see that the countries with low emissions are either heavy on nuclear or have good access to hydro, or both. The rest that are doing well are on gas. Those who shunned gas and nuclear and went all in with wind and solar are usually amongst the worst. We can expand renewables as much as we like, but it's not going to be enough until we have a scalable way of storing the energy generated from them. If we want to reduce gas in order to cut emissions even further, it's going to have to be nuclear. However, if we're talking about the cost of nuclear, the cost of carbon capture and storage for gas generation also becomes an option.
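The generation shares quoted above translate directly into a grid-intensity estimate. A sketch using that snapshot; the gCO2eq/kWh factors are illustrative lifecycle values I've assumed for the example, not official figures:

```python
# Rough grid-intensity estimate from a generation-mix snapshot.
# Intensity factors (gCO2eq/kWh) are assumed illustrative lifecycle
# values; the shares are the snapshot quoted in the comment above,
# with the remainder lumped into an assumed "other" bucket.
INTENSITY = {"gas": 490, "nuclear": 12, "wind": 11, "other": 300}

def grid_intensity(mix):
    """Weighted-average emissions for a generation mix (shares sum to 1)."""
    return sum(share * INTENSITY[src] for src, share in mix.items())

snapshot = {"nuclear": 0.15, "wind": 0.10, "gas": 0.49, "other": 0.26}
print(f"{grid_intensity(snapshot):.0f} gCO2eq/kWh")
```

Even with generous assumptions, the gas share dominates the weighted average, which is the commenter's point: headline renewables capacity matters less than what is actually generating at any moment.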


> "you'll get news saying that on some days renewables became a significant part of the mix (for that day)"

Yes, there is significant day-to-day variance in renewables production. But renewables are a very significant and rapidly growing energy source in the UK, reaching 30% of total grid production in 2018, while both gas and coal are in decline.

In fact, just wind + solar has already exceeded the combined annual generation from the UK's entire nuclear fleet. And it's likely that in the next 1-2 years, wind alone will exceed nuclear production.

Daily variance is also declining over time as the wind turbine fleet becomes more geographically dispersed.

It's true that if the UK had never built gas power plants and still relied extensively on coal, then we'd be in a much worse situation. But if we had gone for gas alone, emissions would be far higher than with a gas+renewables mix. And energy security would be worse, leaving the UK vulnerable to fluctuating prices and potential gas shortages and supply interruptions as most natural gas is imported.

> "Those who shunned gas and nuclear and went all in with wind and solar are usually amongst the worst."

Well, we all know about Germany. The problem here is that not only are they relying on coal, but much of it is actually lignite (brown coal) - the dirtiest form of coal.

And much of the reason they are still so dependent on fossil fuels is not because of lack of renewables production, but because of transmission constraints between the north (where most of the wind production is) and the south (where the biggest demand centers are). This issue is being resolved over time.

You also seem to be ignoring countries like Denmark, Spain, and Portugal who have very successfully moved to wind and solar and now have very little dependence on coal.


Well, I must admit that I was surprised by that 30% production figure. That came fast, hopefully it's possible to keep up the growth in that.


Completely. I found out yesterday that oil and gas companies only pay a fraction of the corporation tax that normal companies pay in the UK! As a society we have very strange priorities, or maybe the people who assume power are just completely self-interested.


And the Government was complaining that giving millions to the EU was making their health care system worse than it was.


There's a really nice fictional drama miniseries from 2018 about the discovery of oil in Norway: "Lykkeland". I assume it's a very stylized version of what actually happened but highlights the conflicting interests a little bit.


Oil companies in the UK are subject to: Ring Fence Corporation Tax, the Supplementary Charge, and Petroleum Revenue Tax.

They are much more heavily taxed than other companies.


There seem to be quite a lot of taxes on them, e.g.:

>This means that the marginal tax rate on PRT paying fields is now 81% (fields not paying PRT pay a rate of 62%) https://en.wikipedia.org/wiki/Petroleum_Revenue_Tax


The UK Petroleum Revenue Tax (PRT) was set to 0% on 1 January 2016, and even before that only applied to fields established prior to 1993.

https://www.gov.uk/guidance/oil-gas-and-mining-petroleum-rev...


Yeah, we could have had a sovereign wealth fund - instead we let the corporations exploit our natural resources for free.

I guess the Empire finally colonised itself.


It says in that link that it was effectively abolished in the 2016 budget...


In scientific computing, the number 130TB doesn't tell you much. The header figure in the article (https://www.spe.org/media/filer_public/e3/31/e331595a-7594-4...) could be made of 130TB volumetric geometry and scalar field data, (in this case miserably) downsampled before rendering.
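One way to make 130TB concrete: if you treat it as a single float32 scalar field (an assumption purely for illustration), it corresponds to roughly a 31,900-voxels-per-side cube:

```python
# What could 130 TB hold? Treated as one float32 scalar field,
# that's roughly a cube about 31,900 voxels on a side.
size_bytes = 130e12
samples = size_bytes / 4          # 4 bytes per float32 sample
side = round(samples ** (1 / 3))  # edge length of an equivalent cube
print(f"{samples:.2e} samples ~ {side:,}^3 voxels")
```

Any rendering of such a volume at screen resolution necessarily throws away almost all of that detail, which is why the header figure tells you so little about the underlying data.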


The 2016 geophysical data summary and data package details (Excel) has a nice map that gives a bit of context to the region and data. [1]

I just wanted to see some numbers or a nice pdf or two with a few seismic plots; [2] and [3] delivered. Although I'm not so sure about the value of the word clouds [4], the Relinquishment reports are concise with pretty plots.

[1] https://www.ogauthority.co.uk/data-centre/data-downloads-and...

[2] https://www.ogauthority.co.uk/data-centre/interactive-maps-a...

[3] https://data-ogauthority.opendata.arcgis.com/pages/statistic...

[4] gas! https://itportal.ogauthority.co.uk/web_files/gis/images/Word...


And it should all be left exactly where it is. Fossil fuels are killing our people and our environment.


The current government is super keen on getting fracking started.


A lot of data that’s probably already been thoroughly pored over.

If you’re searching for logs on a producing well it might be useful.

But, I somewhat doubt the seismic data has much value if they’re giving it away for free.


I have to disagree strongly. The seismic data is probably at least as valuable as the well data. There is a reason that the acquisition contractors and other data owners don't make regular releases of old, legacy data for the public domain.

It really has an infinite shelf life due to the processor's ability to periodically employ newer, faster algorithms on newer, faster hardware to produce products that, though they frequently show only marginal differences, are still marketable as new products or upgrades over old datasets.

Legacy data, due to the acquisition methods employed decades ago, cannot be replicated today. You will not get a permit to acquire airgun data today using the same energies they routinely used a few decades ago nor will you be able to use a broadband source like dynamite offshore. It was routine back in the 60's and 70's. The bandwidth of data today is different. New source types and improvements to old sources can help but the old data has tremendous value as a calibration. The penetration of energy for imaging the deepest events in the subsurface is so much better in old data due to the low frequency penetration characteristics of old sources (higher energy sources).

If you look around for free or publicly available seismic data there really isn't much and ten years ago there was almost none that was easy to find. Industry groups hoping to help newcomers learn by processing raw field data have always been beggars to the data holders. Licensing restrictions follow data everywhere and a lot of it comes with tight constraints on how it can be used and whether it can be published.

Most contractors hold tightly to their data because it doesn't matter how old it is, you can always squeeze it through another processing flow and output a brand new, improved product and offer that for sale to your existing and prospective clients. Old surveys get new names, they are merged with new data using match filters and cross-correlations and tied so that it is not possible to tell where the old data coverage ended and the new data began.

I started processing almost 30 years ago. Some of the data we processed then was already 20 years old. It served to help a client decide whether a new survey would offer any value to their exploration efforts by contributing a more detailed subsurface image. 2D was a great reconnaissance tool and still is today. By reprocessing some 2D data a client can focus their 3D efforts on prospects where the potential for success is highest, thus cutting their costs. And if you think cost cutting isn't a thing in the oil patch: seismic data processing is a loss leader for the big contractors.

I did some 4D seismic processing which involves acquisition of new data using the exact same parameters and processing flows as were used in the first survey. New data is then compared to older data so that operators can see the tell-tale changes in their reservoirs which indicate migration of fluids in the subsurface during production or fluid invasion during waterfloods or CO2 floods.
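The core of a 4D comparison like that can be sketched as finding the time shift at which a monitor trace best matches the base trace. A toy brute-force version (real repeatability analysis involves much more, e.g. windowing, normalization, and amplitude matching):

```python
import math

def best_lag(base, monitor, max_lag=10):
    """Lag (in samples) at which the monitor trace best matches the
    base trace, by brute-force cross-correlation over small lags."""
    def xcorr(lag):
        pairs = [(base[i], monitor[i + lag])
                 for i in range(len(base))
                 if 0 <= i + lag < len(monitor)]
        return sum(a * b for a, b in pairs)
    return max(range(-max_lag, max_lag + 1), key=xcorr)

# Demo: a synthetic trace delayed by 3 samples correlates best at lag 3
base = [math.sin(0.3 * i) for i in range(100)]
monitor = [0.0] * 3 + base[:-3]
print(best_lag(base, monitor, max_lag=5))
```

In a real time-lapse survey, lags and amplitude changes like this (mapped per reflector across the whole volume) are the tell-tale signs of fluid movement in the reservoir.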

Old data never dies nor does it lose its value. Like I mentioned above, new data gets matched to old, old data gets matched to new. Any time a survey is acquired for the first time in an unexplored area, that survey data becomes the ground truth dataset. All future data will be compared to it for quality, bandwidth, signal to noise ratio, etc.

Geophysics, or seismic data processing, really is a "what do you want it to look like" operation. Once you know the acquisition geometry then you can determine everything you need to know to image the subsurface just by smashing it through enough algorithms to filter out all the geology-related attributes like formation velocities, amplitudes - especially anomalous amplitudes, formation thicknesses and their depths below the surface or the seafloor, fluid content, etc. It really is amazing what you can discover without once touching a rock today. It goes so far beyond imaging subsurface structures. I love this field of work.

I am pretty happy to see seismic data released for personal use. I will be digging through this to see what I can find.


You make some great points. Certainly it’s useful for lots of other cases - I was just considering it from an E&P perspective.

They already have numerous surveys and lines throughout those areas.

My first job out of college I processed and archived several warehouses of data going back to some of the very first analog signals recorded, TI’s first digital tapes (named GSI at the time) and also digitized paper records from the 1920s. Great first job that exposed me to massive data and algorithms!


Thanks. I would've loved to work with that ancient data. I think one of the biggest challenges as a processor is to understand all the information in the observer reports and how it relates to handling things like field source and receiver geometry, noise issues during acquisition, etc. I imagine that the oldest stuff was treated like a science project, with everything notable recorded in detail for future reference. I know that by the 1960's field crews had pretty standard notes they would take. Later in the 70's, as activity picked up before the big bust in 81-82, things became almost sloppy, so it took a lot more effort to figure out what actually happened during acquisition.

I worked on field crews back then, and one of our observers would fill out his paperwork the night before while he had a few beers and smoked all kinds of things. Then the next day he would just make quick notes if something ended up different. Too bad he wasn't diligent about modifying his pre-written reports. The prevailing belief on the field crews was that someone would figure it all out in processing later, so if they didn't get it right during acquisition we could always fix it. Some of the best projects I worked on as a processor involved unraveling the chains of errors in documentation to improve imaging of old surveys.


Interesting that this follows the Equinor release last June. We've been working on analyzing the Volve Reservoir data, but the data ingestion is getting very difficult because of the non-standard documents and data types in the repository. Should be interesting to see if the UK's data will be the same.


So, does anyone have a torrent for mirroring? I can't find anything but a less than useful Web UI and mostly derivative datasets/alternative formats.


Anyone have any links to help me understand the terms, how they actually gather seismic data, etc.? I'm interested and would like to learn.


This may generate a lot of controversy and even fuel more wars.


> infrastructure data

That bit scared me a bit. It is not always known where those massive pipes are through the land. I hope they left out the details that would pose a security threat or could be used in sabotage.


> It is not always known where those massive pipes are through the land

These pipelines are quite easily identified on-shore by surface markers in the UK[0]. Also, like I did, folks would have noticed the fairly substantial earthworks that went on when they were being installed... and could have bided their time if they meant to interfere with them. I remember the excavations in Scotland for these pipelines back in the late 70's and 80's. There's no big secret about them.

As to offshore, well your "terrorists" are going to need a fairly huge amount of resources to attack them...deep sea divers, vessels large enough to withstand North Sea storms etc.

[0]: https://www.alamy.com/stock-photo-oil-pipeline-marker-with-f...


I have read an account of the SBS storming a North Sea rig as part of an exercise and it wasn't exactly easy for them with all the very best training and gear and supported by Royal Navy ships - and that was in reasonable weather conditions.


Security through obscurity should be long dead by now. If the only thing stopping people from damaging these assets is their 'secret' location, we have a big problem.


Security through obscurity has different merits depending on the context.

In software, it is a really bad idea to hope that your code is so obscure that no attacker will find a security flaw.

IRL, on the other hand, unknown positions of assets are often critical (the uncertainty over whether you know about all the adversary's nuclear silos is a huge part of nuclear deterrence policy). There are only so many resources you can throw at discovering underground pipelines.


Silos are hard to move; secret nuclear launch locations are provided by SSBNs.


This is completely incorrect. Things can be made so difficult that they become impractical by forcing the bad actor to do a lot of grunt work.


Not for finding significant pipelines; that's pretty easy if you want to do it.


In some sense, that's what cryptography is.


People like vandals and thieves and vagrants can be deterred by obscurity. There are lots of things which enjoy a modicum of peace and non disturbance because of obscurity. Maintenance sheds, utility rooms, etc.


Security through obscurity is a useful and valid part of a security process, but only if it's one of many parts of the process.


Do they put Fylingdales on maps yet? Though of course it's been well known for decades :)


I did a quick check and RAF Fylingdales is easy to find on Apple and Google maps. However, my OS Maps app won't find it even though I can find other 'normal' RAF bases (e.g. RAF Lossiemouth).


I should be surprised that OS still don't list it, but I think it's almost quaint these days. :)


Well, for doing anything outdoors I'd take OS 1:25000 maps over Google or Apple maps any day (both the app and backup paper versions).


Me too. I meant the continued absence of a huge base and radomes from the OS maps was the quaint relic, expressed poorly. :)


It's more likely that negligent construction workers will accidentally puncture a pipeline than that terrorists will blow one up.


Negligence can be factored into insurance costs. Act of war, not so much.


I'm sure plenty of insurance companies would underwrite terrorist events, considering how rare they are compared to other risks.


They're not exactly secret either. Has any terrorist group tried this in a Western country?


And let's be honest with ourselves: if they were trying to destroy things (and not... well... terrorize...) then there are A LOT of sectors that are EXTREMELY vulnerable [0]. It doesn't take much knowledge or money to perform acts that would have extreme economic impact. But WannaCry (or something far worse) is less scary than a person shooting up a nightclub. Terrorists are trying to strike fear, not inflict economic damage. State actors would be the ones trying to do economic damage, but we're not really at war with anyone who would do that (on a meaningful scale).

[0] A quick Google search will lead you down a huge rabbit hole. Here's a good start: https://www.youtube.com/watch?v=pL9q2lOZ1Fw but there are huge flaws in all these sectors, and even in cellular communication. Just watch some Black Hat and DEF CON conference talks.


Fair point, not yet.

But I assume that a pipe placed 5m underground on an unmarked path crossing a valley was done so for security as well as convenience. I will not apply to see that data, but if I were a "Red Team", it would be 'useful' information to have.


They used to have quite the secrets policy in place to cover critical fuel infrastructure, yes.

I'm pretty sure the only reason they're opening this up now is that it's become so easy to get ground-penetrating radar satellites into space that every state actor now knows where everyone else's pipes are. They can't hide it, so they may as well give up trying.


Does accidental damage count?

https://www.cnn.com/2019/01/18/americas/mexico-gasoline-expl...

Edit: only the ignition was accidental, the puncturing was intentional


And the license is...?


I see the data is published under the Open Government Licence


I know this is not the time or place, but Brexit is having some unusual side effects - while the entire political and advisory class of government has been distracted for three years, a number of "I am surprised that got through" things have been happening: moves towards fighting tax havens in the City, and things like this ... just ... unusual


We have the unusual position of a government that in any previous parliament would have fallen. The Prime Minister would have resigned on the first defeat and I think we'd already have a new government elected by now. A better government seems unlikely, but probably one with a majority.

Thanks to the badly designed Fixed-term Parliaments Act, the government is effectively stuck there, powerless, breaking records for scale of defeat, and utterly discredited. Thanks once again to Cameron, then - for letting the Lib Dems break something they didn't understand.

Popular measures have a chance, while the rest is ignored as the government fails to consider anything but Brexit, and can't even deliver that.

It would be unbelievable in House of Cards, or Yes, Minister.


I was (unusually for me) not discussing the specifics of Brexit, just that once you take away the politicians usual oversight, many unusual, perhaps radical, positive things seem to happen.

Sadly the lesson is not to get rid of the politicians completely.


In some ways it's more like how politics used to be. Before entire parties were expected to be perfectly on message for every minor policy. Not exactly independent, but a little freedom to act and think.

The message may not be to get rid of politicians, but perhaps that a major weakening of the party system is called for. Getting rid of career politicians might be a good move though.

Course I wouldn't expect any party with realistic chance of government to favour either those or PR. I can hope though... :)


The Conservatives have actually been pretty good on open data for quite a while. They made lots of mapping and LIDAR data available.


While our politicians are finger-pointing to save their skins, progress (and capitalism) doesn't stop; it marches on.

I was reading that Shell UK is making a shift toward greener energy, which scares me, because big oil is NEVER green-friendly unless they're after something more sinister.


Oil companies aren't evil, they just care about nothing but the bottom line. Once that is served by moving to green energy, they'll simply shift over at whatever speed makes the most business sense.

Unless it's a tiny PR stunt, R&D project or mandated by regulation, you should instead take it as a sign that for the first time green energy in a free market is competitive at scale in certain situations, even without accounting for externalities. That's fantastic news.


The young-ish CEO of .. Royal Dutch Shell (?) .. one of the majors .. was ousted in an internal coup for embracing green energy openly, in the early 2000s. It was quite well documented.


Oil companies do not care more about their bottom line than wind or solar companies do. The green lobby is just as strong, if not stronger, relative to its age.

Solar and wind are neither clean nor cheap, but they do have a lot of political backing.

I know it's not a popular thing to say here, but it's nonetheless true.


When comparing wind or solar with oil or gas, in what sense is wind or solar not cleaner than oil or gas?


I am not comparing them as such; I am saying that they aren't as green as people think.

Actually producing them is one thing. They require huge areas of land, they are unreliable (meaning they still need oil, gas, and coal for when they don't work), solar uses rare earth metals, they can only be used for one thing, and they currently produce only around 1% of our needs. They are great as supporting sources of energy, but they aren't actually solutions to our fundamental energy needs. And they aren't as cheap as claimed, since production, installation, and decommissioning normally aren't factored into the cost in the comparisons you see.

The greentech industry is every bit as bottom-line focused as the oil industry, and it is so with an inferior product mostly pushed through by political lobbying, not on market terms.


Or, more realistically, they're hedging their bets: businesses exist to make money, not CO2, and green power is becoming increasingly profitable and in demand.


All the big oil companies did a lot of photovoltaic research in the 70s. I think Exxon might have been one of the earliest players. I don't think it was sinister then, just diversification into "energy" companies, especially with the oil crises going on back then.


They're still spending millions lobbying on the wrong side of the argument, so they're getting greener from a starting point trillions overdrawn. I tend to see any green-tinged events as greenwashing.

Still, hopefully I'll be pleasantly surprised a few times.


Shell has a legal obligation to make money for shareholders. Shareholders are divesting from fossil fuel companies because (a) they're too high risk, and (b) shareholders want a greener future. That's all it comes down to.

https://www.theguardian.com/business/2017/nov/28/shell-doubl...



