How to innovate in the Age of Big Data presented by Stephen Brobst


Executive Summer Session

Stephen Brobst will be at the European Data Innovation Hub. We asked him to share his views on the importance of open data, open source, analytics in the cloud and data science. Stephen is at the forefront of technology, and we can’t wait to hear what is happening in Silicon Valley. You can count on leaving the workshop inspired and armed with actionable ideas that will help you define a profitable strategy for your data science teams.

Format of the session:

  • 15:00 – Keynote: How to innovate in the Age of Big Data
  • 15:50 – Open Discussion on “Sustainable Strategies for Data Science”, tackling the following topics:
  • Data Science is the Key to Business Success
  • Three Critical Technologies Necessary for Big Data Exploitation
  • How to Innovate in the Age of Big Data
  • 16:45 – Networking Session

Stephen Brobst is the Chief Technology Officer for Teradata Corporation. Stephen performed his graduate work in Computer Science at the Massachusetts Institute of Technology, where his Master’s and PhD research focused on high-performance parallel processing. He also completed an MBA with joint course and thesis work at the Harvard Business School and the MIT Sloan School of Management. Stephen is a TDWI Fellow and has been on the faculty of The Data Warehousing Institute since 1996. During Barack Obama’s first term he was also appointed to the Presidential Council of Advisors on Science and Technology (PCAST) in the working group on Networking and Information Technology Research and Development (NITRD). He was recently ranked by ExecRank as the #4 CTO in the United States (behind the CTOs from Amazon.com, Tesla Motors, and Intel) out of a pool of 10,000+ CTOs.

Job – Click@Bike – Senior Data Engineer

Click@Bike is a promising start-up with European ambitions for the development and distribution of an innovative touristic cycling product-service offering for the hospitality sector. The Company is well capitalised and has extensive international support from the public sector.

To reinforce its in-house IT development team, the Company is looking to hire a Senior Data Engineer.

About the role: Data Engineer

You will be a Senior Data Engineer responsible for all operational aspects of the Company’s data: sourcing it through public and commercial tourist channels, designing a robust schema and data model, and implementing and maintaining the data infrastructure using the latest technologies and best practices. The aim is to provide the most current data in a secure, efficient, reliable and scalable manner to support back-end services and front-end user information display services.

The Senior Data Engineer will work together with external data providers. She/he will perform the technical analysis of the specific data interfaces, execute the data translation and integration with the in-house back-end systems by implementing or developing the respective Extract, Transform, Load (ETL) solutions, and, together with the Product Manager, roll out prototypes and final products to customers.

Beyond the primary tasks of the Senior Data Engineer, it is the Company’s strategy to broaden the scope of the role over time towards software engineering tasks in the back-end application stack, in order to cross-functionalize the IT department.

The Senior Data Engineer reports to the Product Manager; as the first in-house data engineer, she/he has the unique opportunity to develop the Company’s data engineering processes from the ground up and to manage the Company’s data.

The Senior Data Engineer will establish and lead the data engineering team.

Essential Qualifications

  • Master’s degree in IT with 7 years of job-related experience
  • Experience as a software engineer, data engineer, data architect or any combination of these roles
  • Software programming skill set and sound knowledge of design patterns
  • Experience in object-oriented programming, preferably using Java
  • Deep understanding of database schema design and data structures
  • Experience in data modelling using, inter alia, entity-relationship diagrams and UML
  • Experience with RDBMS: MySQL, PostgreSQL, MS SQL or Oracle
  • Experience with ETL tools, preferably open source, such as Talend, Pentaho
  • Experience with RDBMS spatial extensions
  • Experience with structured data communication: SOAP, REST/XML, JSON
  • Excellent technical communication skills
  • Software modelling, architecture & software services design experience
    • Unified Modelling Language (UML)
    • User stories, use cases
    • System design
    • Functional and technical systems specifications
  • Experience with back-end and front-end application development
  • Experience in IT project management
  • Knowledge of NoSQL database concepts and their typical use cases
  • Experience in team leadership

Desirable Qualifications

  • Experience in working in an international environment
  • Experience with NoSQL databases such as graph DBs (e.g. Neo4j), document DBs (e.g. MongoDB or CouchDB) or key-value stores (e.g. Voldemort)
  • Experience with Java Enterprise: J2EE, JSP/JSF, EJB, JDBC, JMS framework
  • Experience with Android development
  • Experience in web development and system setup: Apache/Tomcat, PHP

Office & Development Tools

  • MS Office
    • MS Word, MS PowerPoint: advanced user
    • MS Access, MS Excel: power user
  • Eclipse/NetBeans, Bitbucket/Git/GitHub, Confluence & JIRA
  • XML Editor, such as XML Spy or Oxygen XML Editor
  • Enterprise Architect
  • PowerDesigner
  • Operating systems: Windows 8.1 or 10, Linux (Ubuntu, Debian, RedHat)

Soft Skills

  • Spirit to work in a start-up environment
  • Good communication skills
  • Good teamwork competencies
  • Sound analytical skills
  • Sound judgement
  • Service- and customer-oriented

Language Skills

  • We are operating in an international environment; hence high English proficiency is mandatory.

Additionally: fluency in Dutch, French or German; a third language is a plus.

Apply:

Make sure that you are a member of the Brussels Data Science Community LinkedIn group before you apply. Join here.

Please note that we also manage other vacancies that are not public; if you want us to bring you in contact with them too, just send your CV to datasciencebe@gmail.com.

For further information or to apply for this vacancy, please contact Bart Vandermeeren and include your CV.

Job – Junior Data Scientist


Are you pursuing a career in data science?

We have a great opportunity for you: an intensive training program combined with interesting job opportunities!

Interested? Check out http://di-academy.com/bootcamp/, follow the link to our data science survey, and send your CV to training@di-academy.com.

Once selected, you’ll be invited for the intake event that will take place in Brussels this summer.

Hope to see you there,

Nele & Philippe

Analytics: Lessons Learned from Winston Churchill


I had the pleasure of being invited for lunch by Prof. Baesens earlier this week, and we talked about a future meetup subject that could be ‘War and Analytics’. As you might know, Bart is a WWI fanatic, and he has already written a nice article on the subject called ‘Analytics: Lessons Learned from Winston Churchill’.

Here is the article:

Nicolas Glady’s Activities


Analytics has been around for quite some time now.  Even during World War II, it proved critical for the Allied victory. Some famous examples of allied analytical activities include the decoding of the enigma code, which effectively removed the danger of submarine warfare, and the 3D reconstruction of 2D images shot by gunless Spitfires, which helped Intelligence at RAF Medmenham eliminate the danger of the V1 and V2 and support operation Overlord. Many of the analytical lessons learned at that time are now more relevant than ever, in particular those provided by one of the great victors of WWII, then Prime Minister, Sir Winston Churchill.

The phrase “I only believe in statistics that I doctored myself” is often attributed to him. However, while its wit is certainly typical of the Greatest Briton, it was probably a Nazi propaganda invention. Even so, can Churchill still teach us something about statistical analyses and Analytics?


A good analytical model should satisfy several requirements depending upon the application area, and should follow a certain process. CRISP-DM, a leading methodology for conducting data-driven analysis, proposes a structured approach: understand the business, understand the data, prepare the data, design a model, evaluate it, and deploy the solution. The wisdom of the 1953 Nobel laureate in Literature can help us better understand this process.

Have an actionable approach: aim at solving a real business issue

Any analytics project should start with a business problem, and then provide a solution. Indeed, Analytics is not a purely technical, statistical or computational exercise, since any analytical model needs to be actionable. For example, a model can allow us to predict future problems like credit card fraud or customer churn rate. Because managers are decision-makers, as are politicians, they need “the ability to foretell what is going to happen tomorrow, next week, next month, and next year… And to have the ability afterwards to explain why it didn’t happen.” In other words, even when the model fails to predict what really happened, its ability to explain the process in an intelligible way is still crucial.

In order to be relevant for businesses, the parties concerned need first to define and qualify a problem before analysis can effectively find a solution. For example, trying to predict what will happen in 10 years or more makes little sense from a practical, day-to-day business perspective: “It is a mistake to look too far ahead. Only one link in the chain of destiny can be handled at a time.”  Understandably, many analytical models in use in the industry have prediction horizons spanning no further than 2-3 years.

Understand the data you have at your disposal

There is a fairly large gap between data and comprehension. Churchill went so far as to argue that “true genius resides in the capacity for evaluation of uncertain, hazardous, and conflicting information.”  Indeed, Big Data is complex and is not a quick-fix solution for most business problems. In fact, it takes time to work through and the big picture might even seem less clear at first. It is the role of the Business Analytics expert to really understand the data and know what sources and variables to select.

Prepare the data

Once a complete overview of the available data has been drafted, the analyst will start preparing the tables for modelling by consolidating different sources, selecting the relevant variables and cleaning the data sets. This is usually a very time-consuming and tedious task, but needs to be done: “If you’re going through hell, keep going.”

Never forget to consider as much past historical information as you can. Typically, when trying to predict future events, using past transactional data is very relevant as most of the predictive power comes from this type of information. “The longer you can look back, the farther you can look forward.”

Read more here.

New Postgraduate Programme in Big Data & Analytics at KU Leuven!


Nice initiative

It’s probably no news to you: big data is all around us! Every minute, more than 3 million pieces of content are shared on Facebook, 4 million searches are executed by Google, and Amazon makes more than a hundred thousand dollars in sales. This flood of data offers tremendous opportunities, but challenges as well. Gartner estimates that only 15% of organizations are able to exploit big data for competitive advantage, a strong indication of the upcoming need for analytical skills and resources. As the data piles up, managing and analyzing this powerful resource in the best way become critical success factors in creating competitive advantage and strategic leverage.

Due to increasing demand, we are launching a new postgraduate programme: Big Data & Analytics in Business and Management.

This programme is a unique offering as its aim is to bridge the gap between technical concepts and business applications of big data and analytics techniques. The programme will discuss data analytics fundamentals and big data technologies, but also business applications and managerial aspects, such as dealing with team organization, data quality, deployment, valorization concerns and privacy aspects.
The programme is targeted at professionals with a minimum amount of work experience. It consists of 8 half-day classes, organized on Fridays from October 7 until December 2, 2016.

For more information and registrations, please visit our website.

Job – DATA DEVELOPER @ SALESFLARE

Salesflare, the intelligent sales CRM, currently has a very interesting spot available in their team.
If you are interested to work both as a full-stack developer and with data, and if you want to build the sales CRM of the future, read on to see what Salesflare has to offer!
DATA DEVELOPER @ SALESFLARE
As a Data Developer, you will build the next generation of Salesflare, together with the top team and in collaboration with iMinds UGent Data Science Lab.
You will work as a full-stack web developer, developing front to back in JavaScript, using both Node.js and Angular.js in an agile environment. You will not, however, just be building new functionality for Salesflare, but also transitioning its entire architecture from a set-up built on MySQL to a platform that bridges MySQL (for things like user management), Elasticsearch (for searching historical event data), Redis (as a short-term event cache) and Neo4j (to store information about the sales team’s social network).
This new architecture will host an enormous pool of data, acquired from data sources like the email server, calendar system, phones, social media, email & web tracking technology, and more. And this very data will be used to provide intelligent suggestions, like social links, action detection, sentiment alerts, detection of data that needs to be updated, enhanced sales forecasts, …
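As a rough illustration of that polyglot set-up, here is a minimal Python sketch in which plain dicts and lists stand in for the four stores; all class and method names are hypothetical illustrations of the pattern, not Salesflare code:

```python
# Conceptual sketch of polyglot persistence: each kind of data goes to
# the store that suits it best. Dicts/lists stand in for the real stores.

class PolyglotStore:
    def __init__(self):
        self.mysql = {}    # durable records, e.g. user management
        self.elastic = []  # searchable historical event data
        self.redis = {}    # short-term event cache
        self.neo4j = []    # edges of the team's social network

    def save_user(self, user_id, profile):
        # Relational store: canonical user records
        self.mysql[user_id] = profile

    def log_event(self, event):
        # Recent events go to the cache; all events are indexed for search
        self.redis[event["id"]] = event
        self.elastic.append(event)

    def link(self, person_a, person_b, relation):
        # Graph store: who is connected to whom, and how
        self.neo4j.append((person_a, relation, person_b))

    def search_events(self, predicate):
        # Search over the full event history
        return [e for e in self.elastic if predicate(e)]


store = PolyglotStore()
store.save_user("u1", {"name": "Ada"})
store.log_event({"id": "e1", "type": "email", "user": "u1"})
store.link("u1", "u2", "emailed")
emails = store.search_events(lambda e: e["type"] == "email")
print(len(emails))  # 1
```

The point of the design is that no single database has to be good at everything: relational integrity, full-text search, low-latency caching and graph traversal are each delegated to a specialized engine.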
If you are enthusiastic about working with new technologies, this is the place to be. We offer a salary that is at market level plus stock options, a rewarding experience in a top performing (and fun) team, and lots of responsibility. What we expect: 100% dedication to make Salesflare the best sales CRM in the world.
Interested? Email your CV or LinkedIn/Github profile to jeroen@salesflare.com.

First 100 Days as Head of Data Scientists

Trainer: Martine George, PhD.
When: 12/5 (PM), from 14:00 to 18:00
Contact: martine@mgholistic.com

Price: 350,- excl VAT – Register here


Context

This half day seminar is dedicated to the creation and the development of a team of data scientists.

Imagine that you become Head of a team of Data Scientists: what would you do during your first 100 days in this appointment?

Audience

This module is designed specifically for recently appointed managers and heads of data science teams eager to develop their ability to create and grow an effective and sustainable team of data scientists.

Objectives

The objective of this seminar is to share observations, experiences, and some tips and tricks on developing and growing teams of analytics professionals, through the different lenses of a data scientist, a senior manager, a talent development director and an executive coach.

Training Content

This seminar will cover 8 topics, such as sponsorship, analytics talent management, actionability and creating value, internal analytics communities, external partnerships, and more.

The content of this seminar will be a combination of practical experiences, stories and anecdotes, peer learning and self-reflection.

About the Trainer

Martine George holds an MBA and a PhD in physical sciences. She has over 25 years of professional experience, including 15 spent developing business analytics teams within large organizations from different industries. She is Professor of Management Practice at the Solvay Brussels School of Economics & Management and the Talent Development Director of the Accenture Chair in Strategic Business Analytics at ESSEC Business School, Paris. Passionate about developing talent and organizations in business analytics, she is also an ICF-certified coach, facilitator, trainer and consultant in the cultural transformation of organizations.

Job – Nike – EHQ (Hilversum) – Supply Chain Analytics Manager

nike-inc-logo

Become a Part of the NIKE, Inc. Team

NIKE, Inc. does more than outfit the world’s best athletes. It is a place to explore potential, obliterate boundaries and push out the edges of what can be. The company looks for people who can grow, think, dream and create. Its culture thrives by embracing diversity and rewarding imagination. The brand seeks achievers, leaders and visionaries. At Nike, it’s about each person bringing skills and passion to a challenging and constantly evolving game.

Nike Supply Chain experts ensure that every year 900 million pieces of footwear, apparel and equipment arrive at the right destination on time. That’s no easy task. The complex process involves more than 50 distribution centers, a network of thousands of accounts, and more than 100,000 retail stores around the world. Supply Chain professionals from Laakdal, Belgium, to São Paulo, Brasil, make it happen. They constantly push for ways to make Nike’s supply chain faster, more efficient and more responsive to Nike’s always-changing consumer needs.

As a Supply Chain Analytics Manager, you will gain insights from data and present recommendations to senior leaders, lead cross-functional projects, implement capabilities such as data visualization, predictive analytics, and exception-based analysis, and build an analytics platform across the company.

You will collaborate with a team of cross-functional partners and analytics professionals to identify, scope, and execute high-impact analytics projects across the Europe Supply Chain organization. You will develop & communicate actionable insights through problem definition and the application of advanced data modeling techniques. You will also play a pivotal role in elevating analytics throughout the organization and in spearheading initiatives that instill the importance of data driven-decisions.

The role is perfect for a team player with strong analytics experience, drive, curiosity, and exceptional communication skills. You know how to rise above the numbers and explain the essential insights to business users and senior leaders, and you instinctively minimize complexity to focus on results. While you rely on data to prove your point, you love to think outside the box and find creative ways around barriers.

Qualifications

  • MBA or Master’s in a quantitative field (Statistics, Decision Sciences, Operations Research, Computer Science, Economics, etc.) preferred
  • Strong technical and analytical skillset (Excel, R, etc.)
  • Proven ability to sift through data, identify critical information, develop hypotheses, perform rigorous analyses, and make intuitive recommendations to broader audiences
  • Excellent written and verbal communication skills
  • Prior experience synthesizing and packaging complex analyses and delivering results to non-technical audiences including senior leadership teams
  • Demonstrated ability to use data to influence decision making.

Job ID: 00224760. Here is the link.


Job – Artycs – SENIOR DATA SCIENTIST – Brussels


The ARTYCS team are active members of the Brussels Data Science Community, and their offices are located in the European Data Innovation Hub, an incubator focused on data-related startups.

ARTYCS is a start-up providing advisory services on Big Data strategy, full end-to-end management and delivery of data analytics projects and sourcing of data science related profiles. 

ARTYCS stands for “The Art of Analytics”. This represents our main values, i.e. state-of-the-art analytics skills combined with creativity and innovation in a client-oriented mindset.

At ARTYCS, we invest in Research & Development to work with the latest methods and technologies.

Being part of ARTYCS is the will to progress, to be an entrepreneur and to grow as an individual and as a company.

Becoming an employee at ARTYCS means working in a dynamic environment where there is room for new ideas and entrepreneurship.

At ARTYCS, we know that being a Data Scientist requires an extended set of key skills. We also know that Data Science is a dynamic field. For this reason, we are committed to developing our staff through an active Data Science Development Program. Concretely, this means that during your career at ARTYCS, you will be provided with a development plan that clearly states the trainings you will follow and their timing, the development points for which you will receive regular coaching from our senior management, and the data science events you will attend. With this program, we aim not only to develop our Data Scientists but also to ensure that they are at the forefront of Data Science.

For one of our clients, a key player in financial services based in Brussels, we are looking for a Data Scientist. This is an exciting and challenging position as the candidate will join a dynamic, successful and rapidly growing team responsible for handling a large variety of Big Data business use cases. The candidate will be key in delivering the business use cases and expanding the reach of the team.

In this context, ARTYCS is looking for a Senior Data Scientist to join our DS team, take part in studies at clients, and be an active resource in the growth of the company:

Job description

Skills

  • Ability to learn quickly, adjust to changes and think outside the box
  • Creativity, curiosity and a no-nonsense approach
  • Positive attitude, self-motivated and confident
  • Excellent time management and organizational skills with attention to detail
  • Experience working with customers to identify and clarify requirements and ways to meet needs
  • Strong verbal and written communication skills, good customer relationship skills
  • Stay abreast of new tools and technologies to practice up-to-date data analytics strategies

Qualifications

  • Master’s degree in statistics, applied mathematics, or a related quantitative field
  • Strong experience with high-level programming languages such as Python and R
  • Knowledge of Java and Scala is an asset
  • Work experience using applied machine learning or statistical modelling; knowledge of algorithms spanning clustering, regression, classification, mixture models, etc.
  • Experience with command line tools, relational databases (e.g. SQL), data visualization (e.g. Shiny), and version control (e.g. Git)
  • Broad-based understanding of the Hadoop ecosystem
  • Experience with Open Source big data technologies like Spark, Pig, Hive, HBase, Kafka, Storm
  • Knowledge of data extraction and ETL is a must
  • Knowledge of financial sector is a plus

Description of the position

  • Job Title: Senior Data Scientist
  • Location: Brussels, Belgium

For this position, we offer an attractive salary package including benefits (e.g. lunch vouchers, phone, company car, etc.) and a variable part, depending on the seniority of the candidate.

Apply:

Make sure that you are a member of the Brussels Data Science Community LinkedIn group before you apply. Join here.

Please note that we also manage other vacancies that are not public; if you want us to bring you in contact with them too, just send your CV to datasciencebe@gmail.com.

Interested? Contact Laurent Fayet (0476/79.46.28), laurent.fayet@artycs.eu, or visit http://www.artycs.eu

More Jobs ?


Click here for more Data related job offers.
Join our community on LinkedIn and attend our meetups.
Follow our twitter account: @datajobsbe

Improve your skills:

Why not join one of our #datascience trainings to sharpen your skills?

Special rates apply if you are a job seeker.

Check out the full agenda here.

Video channel and e-learning:

Follow the link to subscribe to our video channel.

Join the experts at our Meetups:

Each month we organize a Meetup in Brussels focused on a specific DataScience topic.

Brussels Data Science Meetup

Brussels, BE – 1,451 business & data science pros

The Brussels Data Science Community’s mission is to educate, inspire and empower scholars and professionals to apply data science to address humanity’s grand challenges.

Next Meetup

Using Open Data to promote Data Innovation

Thursday, Nov 26, 2015, 6:30 PM – 86 attending

Training – Hands-on – Spark Streaming – Brussels – December 1st


In this one-day Spark Streaming course you will learn to set up your very own Spark Streaming applications and do real-time data processing and analytics. You will start by setting up data ingestion from HDFS, Kafka, and even Twitter.

Next up, you will learn about the benefits of using one integrated framework for both batch and stream processing, and you will combine streaming and historical data in order to create valuable applications.
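The appeal of that unified model can be illustrated conceptually in plain Python (this is a sketch of the principle, not the actual Spark API): one transformation function serves both the historical dataset and the incoming micro-batches, and the streaming results are merged into the batch totals.

```python
# Conceptual illustration of unified batch + streaming processing:
# the same word-count transformation runs over historical data and
# over each incoming micro-batch, combining both into running totals.

from collections import Counter

def word_count(lines):
    """One transformation, reusable for batch and streaming."""
    counts = Counter()
    for line in lines:
        counts.update(line.split())
    return counts

# Batch: historical data processed in one pass
historical = ["spark streaming", "spark batch"]
totals = word_count(historical)

# Streaming: the same function applied to each micro-batch,
# merged into the running totals (streaming + history combined)
micro_batches = [["spark streaming"], ["kafka spark"]]
for batch in micro_batches:
    totals.update(word_count(batch))

print(totals["spark"])  # 4
```

In real Spark, the analogous property is that the same RDD/DStream transformations (map, reduceByKey, and so on) apply to both batch and streaming data, which is exactly the benefit the course highlights.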

You will learn how fault tolerance is built into Spark Streaming, and you might even get a hint of how to combine it with the Spark MLlib machine learning library.

This training event is organised in collaboration with Oak3 (http://www.oak3.be). The Oak3 Academy is an IT Learning Center providing hands-on, intensive training and coaching to help students develop the skills they need for a successful career as an Information Technology Professional or as a knowledge worker (end-user of software). Our goal is to provide the highest quality training and knowledge transfer that enables a person to start or enhance his or her career as an IT professional or knowledge worker, in a short period of time. We therefore offer knowledge assimilation, facilitate expertise transfer and provide a rewarding learning experience. Our training solutions are designed to help students learn faster, master the latest information technologies and perform smarter.

Prerequisites: Previous experience with programming for Apache Spark in Scala is required.

When: 1 December 2015 from 9:00 AM to 5:00 PM (CET)

Where: European Data Innovation Hub – 23 Vorstlaan, Watermaal-Bosvoorde, 1170 Brussel, BE

Registration: through Eventbrite