When is losing data not important?
by Hu Yoshida on Jul 6, 2011
As a long-time storage person, I had the notion of “thou shalt not lose data!” ingrained in me.
On my last trip to Chicago, I came across an example of where this is not true: the Mach-speed world of investment trading. In this world, should something happen that takes down the system, these organizations stop their business, close out their positions, and will not go back into the market until they know that everything is fixed. The mentality in this sector is that it is better to be out of the market than to sit in a risky position. Apparently speed is so important that these organizations don’t do backups, and they don’t do development testing. If they have a new program they want to try, they run it, and if it doesn’t work, they switch back or get out of the market.
I wonder if this may become a trend for certain types of high-speed analytics on big data. If you are processing huge amounts of data for decisions that have to be made in split seconds, will you have time for practices like backup, recovery of lost data, or development testing?