A while back I watched with great fascination a webinar on Spark and Shark presented by the UC Berkeley AMPLab. Spatially enabling Spark has been on my to-do list ever since.
Spark has since “graduated” and joined the real world as Databricks, which has raised some serious cash to take on MapReduce. Even Cloudera is teaming up with Databricks to support Spark.
So it was time to bring that project back to the front burner, and I posted a project to GitHub that lets me invoke a Spark job from ArcGIS for Desktop to perform a density analysis on data residing in HDFS. The density calculation is based on a honeycomb-style layer that I think produces some pretty neat looking maps. Here is a sample:
Anyway, as usual, all the source code can be found here. Have fun, and happy new year!
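For the curious, the honeycomb density boils down to snapping each input point to the hexagon cell that contains it and counting points per cell. Below is a minimal, hypothetical Spark sketch of that idea in Scala; it is not the project's actual code, and the HDFS paths, hexagon size, and tab-separated x/y input layout are all assumptions:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object HexDensity {

  // Hypothetical hexagon size (center to corner) in the data's map units
  val HexSize = 100.0

  // Snap a point to the axial (q, r) coordinates of the pointy-top hexagon
  // that contains it, using the standard cube-rounding approach.
  def toHexCell(x: Double, y: Double, size: Double): (Long, Long) = {
    val q = (math.sqrt(3.0) / 3.0 * x - y / 3.0) / size
    val r = (2.0 / 3.0 * y) / size
    // Convert to cube coordinates and round to the nearest hex center
    val cx = q
    val cz = r
    val cy = -cx - cz
    var rx = math.round(cx)
    var ry = math.round(cy)
    var rz = math.round(cz)
    val dx = math.abs(rx - cx)
    val dy = math.abs(ry - cy)
    val dz = math.abs(rz - cz)
    if (dx > dy && dx > dz) rx = -ry - rz
    else if (dy > dz) ry = -rx - rz
    else rz = -rx - ry
    (rx, rz)
  }

  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("HexDensity"))

    // Hypothetical input: tab-separated x/y coordinates stored in HDFS
    val counts = sc.textFile("hdfs:///data/points.tsv")
      .map(_.split('\t'))
      .map(t => (toHexCell(t(0).toDouble, t(1).toDouble, HexSize), 1L))
      .reduceByKey(_ + _)

    // One line per hexagon: q, r, point count
    counts.map { case ((q, r), n) => s"$q\t$r\t$n" }
      .saveAsTextFile("hdfs:///output/hex-density")

    sc.stop()
  }
}
```

The output is one row per hexagon cell with its axial coordinates and point count, which a client could then join back to a honeycomb polygon layer for rendering.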