1. The hybrid data cloud.
2. Mobility is driving big data investment.
3. Big data can surround and enhance existing applications.
4. The Internet of Things will make current big data projects look like small stuff.
5. Big innovation is coming to the front end of the data spectrum.
In a new HuffPost/YouGov poll, only 36 percent of Americans reported having "a lot" of trust that information they get from scientists is accurate and reliable. Fifty-one percent said they trust that information only a little, and another 6 percent said they don't trust it at all. See: http://huff.to/19Joyn5
Gary Cokins, a Data Science Association Advisory Board member, has published a new book, "Predictive Business Analytics." Gary is a great writer with a talent for explaining complex subjects clearly. He is a trusted advisor, and I strongly recommend purchasing this book.
I was recently a presenter in the financial planning and analysis (FP&A) track at an analytics conference, where a speaker in one of the customer marketing tracks said something that stimulated my thinking: “Just because something is shiny and new or is now the ‘in’ thing, it doesn’t mean it works for everyone.”
I've been harping on the importance of GPUs since my October 2012 blog post, Supercomputing for $500, and more recently in my reviews here of the SC13 conference. A couple of news stories this month indicate broader recognition of the growing importance of "Big Compute".
More and more frequently we see organizations make the mistake of mixing and confusing team roles on a data science or "big data" project, overloading data scientists with responsibilities outside their specialty. For example, data scientists are often tasked with the role of data engineer, a misallocation of human capital: the data scientist wastes precious time and energy finding, organizing, cleaning, sorting, and moving data. The solution is to add data engineers, among other specialists, to the data science team.