Real-time analytics is a hot topic, especially since Facebook published how it designed and implemented its real-time analytics system. Facebook's design rested on several assumptions about the reliability of in-memory systems and database neutrality, chiefly that transactional memory is unreliable and that HBase should be the only targeted data store, and those assumptions shaped what was built. Can we challenge them? Reliable transactional memory exists in the field; it is a requirement for any in-memory data grid. And there are certainly more database options than HBase alone. Given database and platform neutrality, and reliable transactional memory, what kind of real-time analytics system could be created?
Real-time analytics are becoming part of mainstream system design, with high-profile companies such as Facebook sharing their design and implementation processes, proving that real-time is already a reality.
However, most of these designs rest on assumptions that inherently limit the resulting systems, among them the idea that memory is unreliable, and that there is only one choice of database.
This paper examines the proposition that these assumptions should be challenged, and that by changing them, inherent limitations of real-time analytics systems can be eliminated.
Download the full report from GigaSpaces and read further about the potential for improvement in real-time analytics: Download Real-Time Analytics for Big Data