A total of 175 people registered for the S05E03 meetup of the #datascience community of Brussels. Three legaltech companies demonstrated how they use data science to offer better legal services.
Our speakers explained that data in the legal industry consists mainly of words, and walked us through their approaches to processing vast amounts of unstructured data with Natural Language Processing (NLP) techniques.
The event was held in the clubhouse of the AI & Datascience community at DigitYser on February 7th, 2019. As always, it started with a happy hour at 18:00. All presentations were recorded by Ricardo, head of the technical team at DigitYser, and made available on our different channels: YouTube – Twitter – Facebook.
Presentation 1 started at 19:00:
How Reacfin proposed to use NLP and deep learning to check the conformity of legal and contractual documents in the reinsurance business.
Reacfin presented a use case from the reinsurance business, where they used NLP to automate the review of lengthy contracts. Aurélien Couloumy, head of data science, walked us through their step-by-step approach to tackling this problem using spaCy.
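To give a flavour of the kind of rule-based contract review spaCy enables, here is a minimal sketch (our own illustration with a hypothetical clause pattern, not Reacfin's actual pipeline) that flags a "limitation of liability" clause in contract text:

```python
# Minimal sketch: flag clauses of interest in contract text with spaCy's
# rule-based Matcher. The pattern and sample sentence are illustrative only.
import spacy
from spacy.matcher import Matcher

nlp = spacy.blank("en")  # blank English pipeline: no model download needed
matcher = Matcher(nlp.vocab)

# Hypothetical pattern: three consecutive tokens "limitation of liability"
matcher.add("LIABILITY_CAP", [[
    {"LOWER": "limitation"},
    {"LOWER": "of"},
    {"LOWER": "liability"},
]])

doc = nlp("Section 12: Limitation of liability applies to both parties.")
matches = [nlp.vocab.strings[match_id] for match_id, start, end in matcher(doc)]
print(matches)  # → ['LIABILITY_CAP']
```

In practice a review pipeline would combine many such patterns with trained components (NER, text classification) to check whole documents for missing or non-conforming clauses.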
Presentation 2 started at 19:25:
How Jetpack.AI won the #hackBXLaw in November 2018.
The winners of the hackathon on debt recovery, Jetpack.ai, explained their problem solving approach that led them to win the contest and develop their tool: Magpie.
Dodo Chochitaichvili and Gautier Krings explained how lawyers and data scientists worked together using an agile methodology and KISS principles, where the first iteration was the production of a skateboard.
Presentation 3 started at 19:50:
Darts-ip explained how AI combined with high-quality data is revolutionising Intellectual Property practices.
The managing director of Darts-ip, Evrard Van Zuylen, and data scientist Vignesh Baskaran, aka Vicky, introduced the technology they are developing to create a comprehensive database of patent cases around the globe. They explained the data science process management that led to their competitive advantage.
Join us in London, March 19-20, 2019, for the Deep Learning in Finance Summit. Discover advances in deep learning tools and techniques from the world’s leading innovators across industry, research and the financial sector. Learn about deep learning applications in finance, from algorithms that forecast financial data, to tools for data mining and pattern recognition in financial time series, to scaling predictive models, stock market prediction and blockchain technology. Register here.
A jury will decide who to include in the final agenda. As we receive over 100 requests each year, we have decided to hold two finals, to which you will be invited to present. Your presentation will be streamed to allow public voting.
The dates of the finals are:
April 11th at 18:30 – AI start-up idea of the year
May 16th at 18:30 – AI project of the year
You will also be invited to a short video interview that will be used to help you gain extra votes from the public.
As governments make more data publicly available, it becomes critical to understand the ways in which that data can be used to benefit society. How can government leaders improve service delivery and social impact through enhanced open data strategies?
This recent playbook gives an interesting overview of the challenges.
Here is an overview of the proposed articles:
Turning public data to the public good
How CDOs can promote machine learning in government
Implementing the DATA Act
How CDOs can manage algorithmic risks
Connecting data to residents through data storytelling
How CDOs can overcome obstacles to open data-sharing
It is high time to select the leading topic for the 5th Data Innovation Summit, which will be held at the University of Brussels on June 26th, 2019.
Most of you liked our new location and our partnership with Icity.brussels; the timing at the end of June was also perceived as the ideal moment for a team event just before the holidays start.
Now that we have demystified A.I., I suggest we focus on “How AI can save our humanity” or “Being Human in the age of AI”:
Ensuring that AI serves the public good requires the public to know how the platforms are deploying these technologies and how they shape the flow of information through the web today. However, as many others have pointed out, the level of transparency and accountability around these decisions has been limited, and we’re seeking ideas that help to raise it. This might be new policies in the form of draft legislation, or technical tools that help keep an eye on the information ecosystem.
I propose that the plenary presentations address the following topics:
AI serves the public good
Create an ethics and standards toolkit for using AI
Promote accountability in the use of AI
AI and Justice
AI and qualitative information
How we can build AI to help humans, not hurt us
Artificial intelligence (AI) in the news ecosystem
How to get empowered, not overpowered by AI
How do we make artificial intelligence work for unique community needs and cultural contexts
Can we build AI without losing control over it?
Principles of creating a safer A.I.
Being human in the age of AI
How AI can enhance our memory, work and social lives.
I must admit I was inspired by the Ethics and Governance of AI Initiative, a hybrid research effort and philanthropic fund that seeks to ensure that technologies of automation and machine learning are researched, developed, and deployed in a way that vindicates the social values of fairness, human autonomy and justice.
Let’s find a good title for the event first and meet early next year to discuss how we want to organise dis2019.
Please register for this workshop on eventbrite: https://launching-dis2019.eventbrite.co.uk
The volunteers prepared an impressive dataset for us! Over the past 6 months they organised 17 activities with over 350 attendees in order to be ready for the HIVHACK, which will start Friday and end Saturday. This #data4good initiative is driven by the AI & Data Science Community of Belgium, supported by Johnson & Johnson Global Public Health. Their main request to you is: Join us … because we can use some extra brains to save the world from HIV.
Your mission, should you choose to accept it … is to build a solution that local data experts can use to map out areas of high drug-resistance prevalence.
You will be part of a global effort of 100 hackers and 15 coaches. About 15 teams will be formed on the first day.
The winning team will travel to Africa and train the local data experts to use their solution: teach them how to fish, and help them build a sustainable local datalab.
In case you need some supporting material to convince your friends to join, here is some interesting information about the Hack:
This Friday the 23rd at 13:00, we will hack our way into the HIV drug resistance problem.
HIV drug resistance is a problem that every person on therapy will eventually face. It can sometimes develop on its own after years of treatment or, more commonly, when a person fails to take the drugs as prescribed.
Following the massive increase in people on treatment (from fewer than 800 000 in 2000 to more than 18 million in 2016), a non-negligible rise in the prevalence of resistant HIV strains has emerged and threatens the global commitment to end the HIV epidemic by 2030.
If no action is taken now, HIV drug resistance could lead to a new crisis in the HIV epidemic.
Through this data4good hackathon we will gain an understanding of the epidemiological and non-epidemiological factors influencing the emergence of HIV drug resistance (HIV-DR), and build a model that maps the hotspots. The result will be dynamic, up-to-date graphical representations of HIV-DR maps, allowing governments, health workers and other stakeholders to stay informed and to turn an understanding of the drivers into actionable items for preventing and countering HIV-DR. As part of our mission we intend to educate locally on the approach and outcomes.
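As a toy illustration of the first step such a mapping model might take (hypothetical field names and made-up records, not the hackathon's actual data or method), individual resistance test results can be aggregated into a per-region prevalence estimate:

```python
# Toy sketch: aggregate hypothetical HIV resistance test records into
# per-region prevalence. Regions and records are illustrative only.
from collections import defaultdict

def prevalence_by_region(records):
    """records: iterable of (region, resistant) pairs, resistant a bool."""
    totals = defaultdict(lambda: [0, 0])  # region -> [resistant_count, tested]
    for region, resistant in records:
        totals[region][0] += int(resistant)
        totals[region][1] += 1
    return {region: res / n for region, (res, n) in totals.items()}

records = [("North", True), ("North", False), ("South", True),
           ("South", True), ("South", False), ("South", True)]
print(prevalence_by_region(records))  # → {'North': 0.5, 'South': 0.75}
```

A real solution would of course add geocoding, confidence intervals for small sample sizes, and a map visualisation layer on top of such aggregates.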
About the HIVHACK:
We are happy to invite you to our yearly data4good hackathon that will be held at DigitYser.
1. You’ll Need To Be A Specialist, Not A Generalist
2. Understand The Business Reasons Informing Your Choices
3. It’s Best To Have Cross-Department Expertise
4. Explain Technical Concepts To Non-Technical Audiences
5. You’ll Spend A Lot Of Time With Raw Data
6. Collaboration Is Key (You Won’t Work In A Vacuum)
7. Be Flexible And Consider Context
8. Regular Maintenance And Version Control Is Essential