All the talk about Big Data tends to devolve to media & entertainment, biotech, streaming web data and geophysics. But Big Data is arriving in places we don’t often consider.
Such as Formula 1 racing, the rest-of-the-world’s NASCAR, where racing team budgets can top $250 million a year, of which 5% is spent on telemetry. According to an article in the Financial Times:
A modern F1 car is fitted with about 130 sensors, which send enough information to fill several telephone books by the end of a two hour race via a radio aerial fitted to the car.
All teams are required to use software developed by McLaren, which cuts costs and enables the governing body – the FIA – to more easily detect banned devices.
For example, new Pirelli tires this year meant teams had to watch tire wear, grip and temperature across different weather conditions and tracks, relating all that to driver acceleration, braking and steering.
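To make that concrete, here is a toy sketch in Python – with invented numbers, not real telemetry – of relating one driver-input channel to one tire channel:

```python
# Toy sketch: relate a driver-input channel to a tire channel.
# All numbers are invented for illustration; real channels are logged at
# kilohertz rates and analyzed with far more sophisticated models.
from statistics import correlation  # Python 3.10+

brake_pressure = [0.10, 0.80, 0.90, 0.20, 0.70, 0.95, 0.15]  # normalized, one sample per corner
tire_temp_rise = [0.5, 3.1, 3.6, 0.8, 2.9, 3.8, 0.6]         # deg C gained through the corner

print(f"brake vs. tire-temp correlation: {correlation(brake_pressure, tire_temp_rise):.2f}")
```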
Here’s a printout of data from Monaco last year, from a detailed post on F1 telemetry and data analysis:
At McLaren about 20 of the 47 engineers it takes to races work on the telemetry, with a further 30 or so doing the same thing back at the team’s “mission control” at headquarters. . . .
Teams even run simulations during races to predict expected lap times, which drivers are expected to meet.
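As a crude illustration of the idea – nothing like the models teams actually run – an expected lap time could be projected by fitting a simple trend to the laps completed so far:

```python
# Toy lap-time predictor: fit a straight line (e.g. a tire degradation trend)
# to the laps already run and project the next one. The lap times are made up.
lap_times = [92.4, 92.1, 92.3, 92.6, 92.8, 93.0]  # seconds, laps 1-6 (hypothetical)

n = len(lap_times)
xs = range(1, n + 1)
x_mean = sum(xs) / n
y_mean = sum(lap_times) / n
slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, lap_times))
         / sum((x - x_mean) ** 2 for x in xs))
intercept = y_mean - slope * x_mean

next_lap = n + 1
print(f"Expected lap {next_lap}: {intercept + slope * next_lap:.2f} s")  # ~93.07 s
```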
Data as competitive advantage
Telemetry got started in the 1980s, which means that veteran teams have decades of data to build their simulations. Newer teams have much less data – and less detailed models.
Speed matters
The FIA caps the number of test days and the amount of wind tunnel time to help control costs and level the playing field. That makes the telemetry – real-time race data – even more valuable, if you can quickly analyze and act on it.
Two-way telemetry – where engineers could make engine adjustments remotely during a race – was tried in the 1990s but ultimately banned. But imagine that technology applied to the morning commute during icy or wet conditions.
The StorageMojo take
As quantum mechanics suggests, we live in a statistical universe. More data gives us greater resolution, just as larger populations enable new market niches.
F1 racing telemetry suggests what the future holds for the larger automobile market: massive streams of real-time road and automobile data giving millions of automobiles – and maybe even their drivers – traffic smoothing, energy-optimizing analysis and direction.
This relies, of course, on the cost of CPU cycles, bandwidth and storage continuing their downward spiral. As long as it does, Big Data will keep growing exponentially.
Courteous comments welcome, of course.
Hi,
let’s count how “big” this big data really is.
130 sensors in a car
Suppose each sensor takes a measurement every millisecond. I don’t know if that’s correct, but let’s go with it.
Suppose each value is stored as a number. Since I’m an Oracle database geek, I assume each value is a number stored in 10 bytes. Add a UNIX timestamp (suppose one goes with each value, which is actually a waste of space) and various overheads – another 10 bytes.
Then a two-hour race would take
130 * 1000 * 20 * 7200 bytes = a mere 17.5 GB or so of data. A desktop PC can easily process such “big data” in memory. It doesn’t sound all that “big” to me.
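For the record, here’s the same arithmetic in a few lines of Python (the sensor count, sampling rate and byte sizes are my guesses, as above):

```python
# Back-of-envelope estimate of raw telemetry volume: one car, one race.
# Assumptions: 130 sensors, one sample per millisecond, 20 bytes per sample.
sensors = 130
samples_per_second = 1000      # one measurement every millisecond
bytes_per_sample = 10 + 10     # 10-byte value + 10 bytes of timestamp/overhead
race_seconds = 2 * 60 * 60     # two-hour race

total_bytes = sensors * samples_per_second * bytes_per_sample * race_seconds
print(f"{total_bytes / 2**30:.1f} GiB per car per race")  # ~17.4 GiB
```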
Timur,
If all an F1 team looked at was the data for one car in one race, I’d agree. But most teams field several cars. They correlate the data from the multiple cars in a race with data from testing and simulations, as well as data from previous years at the same track.
Then they are distributing the data to a couple of dozen engineers, who are running the raw numbers through visualization tools and simulations to look for anomalies. When all is said and done I’d expect that each race would require terabytes of data – not a CERN LHC shot, but non-trivial – and there are multiple races.
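To put very rough numbers on that – every multiplier below is a guess for illustration, not anything a team has disclosed:

```python
# Rough scale-up of the single-car, single-race estimate above.
# All of these multipliers are assumptions, purely for illustration.
per_car_race_gib = 17.4       # raw telemetry, one car, one two-hour race
cars = 2                      # two cars per team per race
sessions_per_weekend = 6      # race, qualifying, three practice sessions, testing
derived_factor = 10           # simulations, visualization copies, backups of the raw feed

weekend_gib = per_car_race_gib * cars * sessions_per_weekend * derived_factor
print(f"~{weekend_gib / 1024:.1f} TiB handled per race weekend")  # ~2.0 TiB
```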
But the larger point, which could have been clearer, is that F1 innovations not infrequently make their way to the broader market. Wealthy, crowded countries or cities could mandate vehicle telemetry to monitor and coordinate vehicles and traffic control operations. Make moving, storing and processing the data cheap enough and it will happen.
Robin
As a storage pro and F1 fan I have to take issue with F1 being the “rest-of-the-world’s NASCAR”. The budgets, technological refinements and barriers to entry (for both teams and drivers) put it in a different league entirely.
It is not just the races that they are recording: qualifying, Friday practice, test sessions. Literally every time those cars move an inch, they are recorded.
I used to share an account manager at Dell with a low-end team. He let a few things slip when he discovered I was an F1 fan (accidentally on purpose?). Suffice it to say they were prized customers, so I hate to think what the likes of McLaren, Red Bull and Ferrari have in the datacentre.
I heard Varsha and Hobbs talking about this topic during the 2011 season and one commented that the teams often collect more than 27 terabytes of data during a single race. Much of this info is relayed via satellite to the team headquarters where some of these real-time simulations are run. Staggering.
Big Data is not just big volume; it can be big in variety and veracity too. I also feel that the driver has sensors attached to him, and that there are several other measurements like weather, etc. It’s Big Data.