Spark and combining different modules


Summary of the event

Spark is the new framework of the moment – with apologies to Flink – for distributed computing. One of its most popular features is that it is ready for “everything”: unlike earlier distributed computing frameworks such as Hadoop, a single framework lets us handle different types of use cases without having to depend on other projects for a solution.

This attractive feature may raise some questions: Can a SparkSQL process be combined with another process launched by Spark Core? Can I reuse my batch processes in a streaming process? How can I apply what an MLlib algorithm has learned to real-time logic?

Jorge López-Malla will try to answer these questions in an engaging talk, working live through cases from real projects that have run into exactly these problems.

Some time will be reserved at the end of the talk to answer questions about what has been said and about Spark in general.

*A basic knowledge of Spark is required, as the talk will not explain how the framework itself works; however, no complex algorithmic work will be involved.


Date

27 January 2016

Location

BBVA Innovation Center, Plaza de Santa Bárbara, 2 28004 Madrid (Spain)

Language

Spanish

