March is a beautiful month to visit Silicon Valley, so why not register now and book your trip to the upcoming GPU Technology Conference (GTC), happening March 17-20, 2015 at the San Jose Convention Center? GTC is the most important event for GPU developers and computational scientists.
Reviews of Big Data events from the editors of insideBIGDATA. To see all the big data events in your area, click on the "Event Calendar" tab above.
“And, to me, the point of this talk is that the world is changing. There is money going to be made by the bucketful, and it isn’t going to go to Larry Ellison and his gang. It’s going to go to smart people like you who know how to leverage open-source.”
We are pleased to offer our readers a free promo code for the IC3 Cloud Conference. The event takes place Oct. 27-28 in San Francisco.
“The Hadoop MapReduce framework grew out of an effort to make it easy to express and parallelize simple computations that were routinely performed at Google. It wasn’t long before libraries, like Apache Mahout, were developed to enable matrix factorization, clustering, regression, and other more complex analyses on Hadoop. Now, many of these libraries and their workloads are migrating to Apache Spark because it supports a wider class of applications than MapReduce and is more appropriate for iterative algorithms, interactive processing, and streaming applications.”
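The practical difference the abstract describes is that Spark can keep a working set in memory across passes, while a chain of MapReduce jobs re-reads its input from disk on every pass. Here is a minimal sketch of an iterative loop in Spark's Java API; the toy update rule, dataset, and local master are our own illustration, not from the talk:

```java
import java.util.Arrays;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class IterativeSketch {
    public static void main(String[] args) {
        JavaSparkContext sc = new JavaSparkContext(
            new SparkConf().setAppName("iterative-sketch").setMaster("local[*]"));

        // Cache the dataset once; every iteration below re-reads it from
        // memory. A chain of MapReduce jobs would re-read it from disk.
        JavaRDD<Double> points =
            sc.parallelize(Arrays.asList(1.0, 2.0, 3.0, 4.0)).cache();
        long n = points.count();

        double estimate = 0.0;
        for (int i = 0; i < 10; i++) {
            final double current = estimate;  // snapshot for the closure
            // Toy update step: nudge the estimate toward the data's mean.
            double gap = points.map(p -> p - current)
                               .reduce((a, b) -> a + b) / n;
            estimate += 0.5 * gap;
        }
        System.out.println("converged estimate: " + estimate);
        sc.stop();
    }
}
```

The cache() call is the crux: without it, each iteration would recompute or re-read points, which is essentially the MapReduce cost model the libraries are migrating away from.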
In this special guest feature, Sundeep Sanghavi explains why attending the upcoming Strata + Hadoop World Conference, Oct. 15-17, 2014 in NYC, is important. Sundeep Sanghavi is the CEO and Co-Founder of DataRPM.
“In this talk we summarize the results of the BIG project including analysis of foundational Big Data research technologies, technology and strategy roadmaps to enable business to understand the potential of Big Data technologies across different sectors, together with the necessary collaboration and dissemination infrastructure to link technology suppliers, integrators and leading user organizations.”
“The Hadoop framework has become the most popular open-source solution for Big Data processing. Traditionally, Hadoop communication calls are implemented over sockets and do not deliver best performance on modern clusters with high-performance interconnects. This talk will examine opportunities and challenges in optimizing performance of Hadoop with Remote DMA (RDMA) support, as available with InfiniBand, RoCE (RDMA over Converged Enhanced Ethernet) and other modern interconnects.”
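For readers unfamiliar with why the socket path is the bottleneck: every send and receive traverses the TCP stack and copies data between user space and kernel buffers, while RDMA lets the NIC move bytes directly between application buffers on the two hosts. Below is a minimal loopback sketch of that copy-heavy socket pattern; the block size and count are arbitrary illustration, not Hadoop internals:

```java
import java.io.InputStream;
import java.io.OutputStream;
import java.net.ServerSocket;
import java.net.Socket;

public class SocketPathSketch {
    static final byte[] BLOCK = new byte[64 * 1024]; // one 64 KiB shuffle-style block
    static final int N_BLOCKS = 16;

    public static void main(String[] args) throws Exception {
        // Loopback receiver standing in for a remote node.
        try (ServerSocket server = new ServerSocket(0)) {
            Thread receiver = new Thread(() -> {
                try (Socket conn = server.accept();
                     InputStream in = conn.getInputStream()) {
                    byte[] buf = new byte[64 * 1024];
                    long received = 0;
                    int k;
                    // Each read() copies from the kernel socket buffer
                    // into the user-space buffer; RDMA would skip this.
                    while ((k = in.read(buf)) != -1) {
                        received += k;
                    }
                    System.out.println("received " + received + " bytes over TCP sockets");
                } catch (Exception e) {
                    e.printStackTrace();
                }
            });
            receiver.start();

            try (Socket s = new Socket("127.0.0.1", server.getLocalPort());
                 OutputStream out = s.getOutputStream()) {
                for (int i = 0; i < N_BLOCKS; i++) {
                    // Each write() copies from the user buffer into the
                    // kernel socket buffer before hitting the wire.
                    out.write(BLOCK);
                }
            }
            receiver.join();
        }
    }
}
```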
LexisNexis® Risk Solutions has announced its inaugural HPCC Systems Developer Contest. Developers and other technical professionals have the opportunity to demonstrate how they can leverage HPCC Systems to solve either a Big Data or Complex Query problem.
“When organizations operate both Lustre and Apache Hadoop within a shared HPC infrastructure, there is a compelling use case for using Lustre as the file system for Hadoop analytics, as well as HPC storage. Intel Enterprise Edition for Lustre includes an Intel-developed adapter which allows users to run MapReduce applications directly on Lustre. This optimizes the performance of MapReduce operations while delivering faster, more scalable, and easier to manage storage.”
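An adapter like this is possible because Hadoop's filesystem layer is pluggable: MapReduce resolves all paths through the FileSystem abstraction rather than talking to HDFS directly. A minimal sketch of that mechanism, using the built-in local filesystem as a stand-in (the Lustre adapter's real property names and classes are documented by Intel and are not reproduced here):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;

public class PluggableFsSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Hadoop resolves every job's I/O through its pluggable FileSystem
        // layer. Here we point it at the built-in local filesystem; an
        // adapter such as Intel's registers a Lustre-backed filesystem
        // class the same way, so MapReduce jobs redirect their I/O to
        // Lustre without any application-code changes.
        conf.set("fs.defaultFS", "file:///");
        FileSystem fs = FileSystem.get(conf);
        System.out.println(fs.getUri() + " working dir: " + fs.getWorkingDirectory());
    }
}
```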
In this video from the 2014 Lustre Administrators and Developers Conference, Brent Gorda from Intel describes how the company is adding enterprise features to the Lustre File System.