In this video, Robbie Allen from Automated Insights describes how the company’s artificial intelligence platform sifts through large data sets to spot interesting patterns, trends and insights, and then describes those findings in plain English with the tone, personality and variability of a human writer.
Big Data is all the rage, but its true value remains largely undiscovered, hidden by unintelligible masses of data sets. Our patent-pending technology humanizes data by spotting interesting patterns and key insights, and describes those findings in your native language (English, Spanish, etc.). The result is Big Data analysis anyone can understand.
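The core idea, turning a numeric pattern into a plain-English sentence, can be sketched with simple templates. This is a minimal illustration, not Automated Insights' actual platform; the function name and templates are invented for the example.

```python
def describe_trend(metric: str, values: list) -> str:
    """Spot a simple rise/fall pattern in a series and describe it in English."""
    change = values[-1] - values[0]
    if change == 0:
        return f"{metric} held steady at {values[0]}."
    pct = 100.0 * change / values[0]
    verb = "rose" if change > 0 else "fell"
    # Fill a fixed sentence template with the computed figures.
    return f"{metric} {verb} {abs(pct):.1f}% over the period."

print(describe_trend("Monthly revenue", [100, 110, 125]))
# -> Monthly revenue rose 25.0% over the period.
```

A production system would choose among many templates to add the tone and variability of a human writer, but the pipeline shape (detect pattern, then render it as prose) is the same.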
Over at GigaOM, Stacey Higginbotham writes that ARM-based server Startup Calxeda is finding success in the storage market.
As the market for scale-out computing, storage and networking changes the demands made on IT equipment, Calxeda and others are seeing an opportunity that may have begun in servers and the cloud computing environment, but certainly isn’t stopping there. No wonder Intel is trying to catch up with chips of its own. So far, its recently announced Atom-based chips haven’t made the cut for most customers I’ve spoken with (the lack of integration of the networking and processing hardware is a problem), but in 2014 it will have a new, integrated SoC as well. Then, the competition will really get interesting.
In this video, Sharmila Shahani-Mulligan from ClearStory describes how the Startup makes Big Data work for business.
It’s never been easier to find business data. But it’s never been harder to work with it. Rapidly-proliferating amounts of data sit in companies’ data repositories, in the vaults of specialized data providers and in endless streams of user-shared thoughts on the web. Managers and their teams want quick and easy access to this information, but are stymied by the complexity of data sources and the rigidity of current data tools.
Could your Startup use some help from Data Scientists? Kaggle is a new site for hosting and competing in data science competitions.
Kaggle is an innovative solution for statistical/analytics outsourcing. We are the leading platform for predictive modeling competitions. Companies, governments and researchers present datasets and problems – the world’s best data scientists then compete to produce the best solutions. At the end of a competition, the competition host pays prize money in exchange for the intellectual property behind the winning model.
In this video from SC12, Doug Johnson from Aeon Computing describes the company’s innovative Data Oasis technology powered by the Lustre file system.
“What advantages does Lustre offer as a foundation for a storage system? Bandwidth. Its performance scales out linearly as the file system is built out. The more object servers you have, the more network paths you have, the faster your potential. It is the opposite of a large-scale monolithic NFS appliance with one spigot.”
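The linear scale-out claim above amounts to simple arithmetic: aggregate bandwidth grows with the number of object storage servers (OSS), each contributing its own network path. The per-server figure below is an illustrative assumption, not a Data Oasis specification.

```python
def aggregate_bandwidth_gbps(num_oss: int, per_oss_gbps: float) -> float:
    """Peak aggregate bandwidth when every OSS adds an independent network path."""
    return num_oss * per_oss_gbps

# Assuming a hypothetical 2.5 GB/s per object server:
for n in (4, 16, 64):
    print(f"{n:3d} servers -> {aggregate_bandwidth_gbps(n, 2.5):.1f} GB/s peak")
```

A single NFS appliance, by contrast, caps out at whatever its one "spigot" can deliver regardless of how many disks sit behind it.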
In this video from SC12, Neil Levine from Inktank describes the company’s efforts to commercialize and support the Ceph open source file system. With high reliability and nearly unlimited scalability, Ceph has great potential for Big Data applications as well as an enabling technology for Exascale computing.
In this video, Auren Hoffman from LiveRamp describes how Startups are in a better position to take advantage of open source and capitalize on the potential of the Big Data market.
Big data isn’t just for big companies anymore, it’s for everyone, says Auren Hoffman, LiveRamp CEO and RapLeaf Chairman. Use the tools that let you know which data informs predictive analysis and which data doesn’t. Customer service is where businesses now need to compete, and big data’s potential for increased personalization makes that possible.
In this video, Samplify CEO Alan Evans presents: APAX: Lowering the Cost of Big Science, Big Data, and Cloud Computing.
“Multi-core CPUs are hitting the memory wall,” said Al Wegener, CTO and founder of Samplify. “With each new process node, the number of processor cores on a die can double with Moore’s Law, but the throughput of memory, I/O, and storage fails to keep up with this growth. Hence, the performance of multi-core applications is increasingly memory, I/O, and storage bound. APAX is the only solution that accelerates the throughput of DDRx, SAS/SATA, SSD, PCIe, Ethernet, and InfiniBand, by up to six times.”
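The arithmetic behind an "up to six times" claim is straightforward: if data is compressed by ratio r before crossing a link of raw bandwidth B, the effective payload throughput is r × B. The figures below are assumptions for illustration, not measured APAX numbers.

```python
def effective_throughput(raw_gbps: float, compression_ratio: float) -> float:
    """Effective payload throughput over a bandwidth-limited link,
    given compression applied before transfer."""
    return raw_gbps * compression_ratio

# A hypothetical 10 Gb/s link with 4:1 compression moves payload
# at an effective 40 Gb/s, assuming codec overhead is negligible.
print(effective_throughput(10.0, 4.0))  # -> 40.0
```

The same multiplier applies to any memory, I/O, or storage channel in the quote, which is why a single compression stage can relieve several bottlenecks at once.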
Samplify will demonstrate the APAX profiler and hardware IP at the SC12 conference in booth #4151.
In this video, BonitaSoft CEO Miguel Valdés Faura describes what he sees as the Business Process Management (BPM) trends for 2013.
Trend 1: BPM will extend the longevity of legacy systems
Enterprises have invested millions of dollars in critical infrastructures like IBM mainframes and SAP systems. The pace of change is only increasing, so instead of gutting these expensive legacy systems, we’re seeing BPM increasingly being used to provide a bridge between disparate systems that have traditionally been siloed.
Trend 2: BPM will help make sense of Big Data
The amount of data is ever increasing. Parsing and making sense of this data will be the next challenge. BPM systems can intelligently organize data based on predetermined rules to make Big Data more useful.
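"Organize data based on predetermined rules" can be sketched as a list of (predicate, label) pairs applied in order, a generic rules-engine pattern. This is an illustration of the idea, not BonitaSoft's implementation; the field names and labels are invented.

```python
def classify(record: dict, rules: list) -> str:
    """Return the label of the first rule whose predicate matches the record."""
    for predicate, label in rules:
        if predicate(record):
            return label
    return "unclassified"

# Hypothetical routing rules for incoming business records:
rules = [
    (lambda r: r.get("amount", 0) > 10_000, "needs-approval"),
    (lambda r: r.get("region") == "EU", "eu-queue"),
]

print(classify({"amount": 25_000, "region": "EU"}, rules))  # -> needs-approval
print(classify({"amount": 500, "region": "EU"}, rules))     # -> eu-queue
```

Ordering the rules encodes priority: the high-value check fires before regional routing, so a record never lands in two queues.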
Trend 3: BPM will make inroads into new kinds of processes and services
The rise of new processes and services means another added layer of complexity, as well as another opportunity for BPM systems to offer increased efficiency. More and more of these services are being off-loaded to the cloud.
Trend 4: BPM is becoming more of a PaaS
Traditional thinking posits that BPM is a SaaS; however, this is not what we’re seeing with our customers. BPM is increasingly becoming a strategic tool for aligning business processes and making them more efficient, built on hundreds of native connectors that can speak to a myriad of disparate systems that were once siloed.
Over at Small Business Labs, Steve King writes that Insurethebox is a Startup UK insurance company that sells auto insurance by the mile. To assess rates, they install a small tracking computer in each customer’s car that measures how far they drive and how they drive.
This is a fascinating use of big data that raises all sorts of privacy issues. For example, the insurance company knows where your car is at all times. This data could be very interesting to a variety of folks like law enforcement, government agencies, your employer, your spouse/significant other, etc. And it’s likely (definitely in the case of law enforcement and government agencies) these folks and others could – under certain circumstances – gain access.
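The per-mile pricing model described above can be sketched as a base fee plus a mileage charge, scaled by how the customer drives. All rates and the behavior multiplier here are invented for illustration; Insurethebox's actual pricing model is not described in the source.

```python
def monthly_premium(miles: float, base_fee: float = 20.0,
                    rate_per_mile: float = 0.05,
                    behavior_multiplier: float = 1.0) -> float:
    """Premium = base fee + miles driven * per-mile rate,
    scaled by a driving-behavior score from the in-car tracker."""
    return base_fee + miles * rate_per_mile * behavior_multiplier

# 600 miles in a month at the assumed default rates:
print(round(monthly_premium(600), 2))  # -> 50.0
# The same mileage with a riskier driving score (1.5x) costs more:
print(round(monthly_premium(600, behavior_multiplier=1.5), 2))  # -> 65.0
```

The behavior multiplier is exactly where the privacy tension lies: computing it requires the insurer to log where, when, and how the car is driven.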
Over at TechCrunch, Alex Williams writes that VCs can be expected to continue investing in Big Data Startups for years to come.
This week, Splice Machine raised $4 million to develop its SQL engine for big data apps. MongoHQ raised $6 million for its database as a service. A third startup, Bloomreach, announced $25 million in funding for its big data applications. These three companies provide examples of why the investor community will continue to invest in big data startups for many years to come. All reflect a changing dynamic — the rise of the big data app and the need for a new data infrastructure. These two converging trends now drive funding for a widening number of startups that make data functional inside and outside the enterprise.