Monday, September 2, 2013

How I MapReduced a Neo4j Store w/ Hadoop

When exploring very large raw datasets containing massive interconnected networks, it is sometimes helpful to extract your data, or a subset thereof, into a graph database like Neo4j. This allows you to easily explore and visualize networked data to discover meaningful patterns.

When your graph has 100M+ nodes and 1000M+ edges, the regular Neo4j import tools become very time-intensive (on the order of many hours to days).

In this talk, I'll show how we used Hadoop to scale the creation of very large Neo4j databases by distributing the load across a cluster, and how we solved problems such as generating sequential row ids and writing position-dependent records on a distributed framework.
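The core difficulty is that Neo4j's store files are position-dependent: record N is written at byte offset N * recordSize, so every node and relationship needs a globally sequential, gap-free id even though the input is processed by many independent tasks. A common way to get such ids in MapReduce is a two-pass scheme: a first job counts records per partition, and each task of the second job starts numbering at the sum of the counts of all earlier partitions. Below is a minimal sketch of that second pass, assuming the per-partition offsets have already been placed in the job configuration under keys like "idoffset.<partition>" (the class and key names are illustrative, not the code from the talk).

import java.io.IOException;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// Sketch: assigns globally sequential record ids by starting each task at an
// offset computed from a previous counting job (key name "idoffset.<n>" is assumed).
public class SequentialIdMapper extends Mapper<LongWritable, Text, LongWritable, Text> {

    private long nextId;                              // next record id for this task
    private final LongWritable idKey = new LongWritable();

    @Override
    protected void setup(Context context) {
        // Offset for this task = sum of record counts of all earlier partitions,
        // stored in the configuration by the counting pass.
        int partition = context.getTaskAttemptID().getTaskID().getId();
        nextId = context.getConfiguration().getLong("idoffset." + partition, 0L);
    }

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        // Emit (global record id, original record); a later step can turn the id
        // into a byte position, since store records live at id * recordSize.
        idKey.set(nextId++);
        context.write(idKey, value);
    }
}

In practice the per-partition counts would typically come from the counters or output of the counting job; the key point is that once every task knows its offset, id assignment needs no coordination at runtime.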


