This tutorial explains how to access Apache Spark SQL data from a Node.js application using the DataDirect Apache Spark SQL JDBC driver on a Linux machine or server.
Apache Spark is changing the way Big Data is accessed and processed. While MapReduce was a good implementation for processing and generating large data sets with a parallel, distributed algorithm on a cluster, it was not optimized for interactive data analysis that involves iterative algorithms. Spark was designed to overcome this shortcoming.
As you implement Apache Spark in your organization, we understand that you need ways to connect your Apache Spark deployment to other JDBC applications. Apache Spark SQL exposes your data to any JDBC client through its Thrift server. We put together a tutorial that explains how you can connect a Node.js application on Linux to Spark SQL using the DataDirect Spark SQL JDBC driver.
If you are looking to connect a Node.js application using the Spark SQL ODBC driver instead, visit this tutorial.
Download the Progress DataDirect Spark SQL JDBC driver package, then extract it and launch the installer from a terminal:
unzip PROGRESS_DATADIRECT_JDBC_SPARKSQL_x.x.x.zip
java -jar PROGRESS_DATADIRECT_JDBC_INSTALL.jar
Start the Spark shell using the following command. It configures the Thrift server to run in single-session mode, because the imported data will only be registered as a temporary table and temporary tables are visible only within the session that created them. It also includes the spark-csv package, which is needed because importing data from CSV is not supported natively:
spark-shell --conf spark.sql.hive.thriftServer.singleSession=true --packages com.databricks:spark-csv_2.11:1.4.0
import org.apache.spark.sql._
import org.apache.spark.sql.hive._
import org.apache.spark.sql.hive.thriftserver._
//Read from CSV
val df = sqlContext.read.format("com.databricks.spark.csv").option("inferSchema","true").option("header","true").load("/path/to/InsuranceData.csv")
//Check if CSV was imported successfully
df.printSchema()
df.count()
//Register Temp Table
df.registerTempTable("InsuranceData")
sqlContext.sql("select count(*) from InsuranceData").show()
//Start the Thrift server with the same HiveContext so the temporary table is visible to JDBC clients
val hc = sqlContext.asInstanceOf[HiveContext]
HiveThriftServer2.startWithContext(hc)
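On the Node.js side, the application uses the jdbc and async packages from npm along with the DataDirect Spark SQL driver jar installed earlier. Before writing the full application, you can confirm that the driver can reach the Thrift server started above with a quick connectivity check. The following is a minimal sketch; the hostname, port, credentials and jar path are placeholders you will need to adjust for your environment:
var JDBC = require('jdbc');
var jinst = require('jdbc/lib/jinst');

if (!jinst.isJvmCreated()) {
  jinst.addOption("-Xrs");
  // Path to the DataDirect Spark SQL driver jar installed earlier (placeholder).
  jinst.setupClasspath(['./path/to/sparksql.jar']);
}

var sparksqldb = new JDBC({
  // <hostname> and <port> are where HiveThriftServer2 is listening (placeholders).
  url: 'jdbc:datadirect:sparksql://<hostname>:<port>;DatabaseName=default',
  drivername: 'com.ddtek.jdbc.sparksql.SparkSQLDriver',
  minpoolsize: 1,
  maxpoolsize: 2,
  user: 'username',
  password: 'password',
  properties: {}
});

sparksqldb.initialize(function(err) {
  if (err) {
    return console.log(err);
  }
  // Reserve a connection and give it straight back; if this succeeds,
  // the driver, classpath and Thrift server are all wired up correctly.
  sparksqldb.reserve(function(err, connObj) {
    if (err) {
      return console.log(err);
    }
    console.log("Connected: " + connObj.uuid);
    sparksqldb.release(connObj, function(err) {
      if (err) {
        console.log(err.message);
      }
    });
  });
});
The full application below builds on the same configuration to query the InsuranceData temporary table.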
var JDBC = require('jdbc');
var jinst = require('jdbc/lib/jinst');
var asyncjs = require('async');
if (!jinst.isJvmCreated()) {
  jinst.addOption("-Xrs");
  jinst.setupClasspath(['./path/to/sparksql.jar']);
}
var config = {
  // SparkSQL configuration to your server
  url: 'jdbc:datadirect:sparksql://<hostname>:<port>;DatabaseName=default',
  drivername: 'com.ddtek.jdbc.sparksql.SparkSQLDriver',
  minpoolsize: 1,
  maxpoolsize: 100,
  user: 'username',
  password: 'password',
  properties: {}
};
var sparksqldb = new JDBC(config);
//initialize
sparksqldb.initialize(function(err) {
  if (err) {
    console.log(err);
  }
});
sparksqldb.reserve(function(err, connObj) {
  if (connObj) {
    console.log("Using connection: " + connObj.uuid);
    var conn = connObj.conn;
    // Query the database.
    asyncjs.series([
      function(callback) {
        // Select statement example.
        conn.createStatement(function(err, statement) {
          if (err) {
            callback(err);
          } else {
            statement.setFetchSize(100, function(err) {
              if (err) {
                callback(err);
              } else {
                //Execute a query
                statement.executeQuery("SELECT * FROM InsuranceData;",
                  function(err, resultset) {
                    if (err) {
                      callback(err);
                    } else {
                      resultset.toObjArray(function(err, results) {
                        //Printing number of records
                        if (results.length > 0) {
                          console.log("Record count: " + results.length);
                        }
                        callback(null, resultset);
                      });
                    }
                  });
              }
            });
          }
        });
      },
    ], function(err, results) {
      // Results can also be processed here.
      // Release the connection back to the pool.
      sparksqldb.release(connObj, function(err) {
        if (err) {
          console.log(err.message);
        }
      });
    });
  }
});
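The example above only prints the record count, but each element returned by toObjArray() is a plain JavaScript object keyed by column name (taken from the CSV header), so individual rows and columns can be inspected directly. As a rough sketch, the toObjArray callback could be extended like this:
resultset.toObjArray(function(err, results) {
  if (err) {
    return callback(err);
  }
  console.log("Record count: " + results.length);
  if (results.length > 0) {
    // Column names come from the CSV header inferred by spark-csv.
    console.log("Columns: " + Object.keys(results[0]).join(", "));
    // Print the first few rows for a quick sanity check.
    results.slice(0, 5).forEach(function(row, index) {
      console.log("Row " + (index + 1) + ": " + JSON.stringify(row));
    });
  }
  callback(null, resultset);
});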
The DataDirect Spark SQL JDBC driver is the best-in-class certified connectivity solution for Apache Spark SQL. For more information about the driver, visit our Spark SQL JDBC driver page and try it free for 15 days. You can also learn more about our other solutions for Apache Spark and other big data frameworks here. Subscribe to our blog via email or RSS feed for more tutorials.
Saikrishna is a DataDirect Developer Evangelist at Progress. Prior to working at Progress, he worked as a Software Engineer for three years after completing his undergraduate degree, and he recently graduated from NC State University with a Master's in Computer Science. His interests are in the areas of Data Connectivity, SaaS and Mobile App Development.