The rapid evolution of big data technology and scientific research in the past few years has forever changed the pursuit of scientific exploration and discovery. Alongside traditional experiment and theory, computational modeling and simulation is a third paradigm for science. Its value lies in exploring areas of science in which physical experimentation is unfeasible and insights cannot be revealed analytically, such as climate modeling, seismology, and galaxy formation. More recently, big data has been called the “fourth paradigm” of science. Big data can be observed, in a real sense, by computers processing it and often by humans reviewing visualizations created from it. In the past, humans had to reduce the data, often using techniques of statistical sampling, to make sense of it. Now, new big data processing techniques help us make sense of it without traditional reduction.

In 2007, Jim Gray, the late U.S. computer scientist from Microsoft, described a major shift under way in scientific research: a “fourth paradigm” for scientific exploration and discovery. He predicted that the collection, analysis, and visualization of increasingly large amounts of data would change the very nature of science. One of the goals of big data discussed in the book The Fourth Paradigm is to make the scientific record a first-class scientific object. Fast-forward to 2015 and we see distinct evidence of how the big data technology stack is facilitating this change.

This technology guide is geared toward scientific researchers working at universities and other research institutions (e.g., NASA, JPL, NIH) who may benefit from learning more about how big data can be meaningfully transformative when applied to the data collection and analysis parts of their projects.
Further, we’ll illustrate how Dell big data technology solutions powered by Intel are actively helping scientists stay focused on their data, their models, and their research results.