
Data Science


Practical AI #183

AI's role in reprogramming immunity

Drausin Wulsin, Director of ML at Immunai, joins Daniel & Chris to talk about the role of AI in immunotherapy, and why it is proving to be the foremost approach in fighting cancer, autoimmune diseases, and infectious diseases.

The large amount of high dimensional biological data that is available today, combined with advanced machine learning techniques, creates unique opportunities to push the boundaries of what is possible in biology.

To that end, Immunai has built AMICA, the largest immune database, containing tens of millions of cells. The company uses cutting-edge transfer learning techniques to transfer knowledge across different cell types, studies, and even species.

Practical AI #171

Clothing AI in a data fabric

What happens when your data operations grow to Internet-scale? How do thousands or millions of data producers and consumers efficiently, effectively, and productively interact with each other? How are varying formats, protocols, security levels, performance criteria, and use-case specific characteristics meshed into one unified data fabric? Chris and Daniel explore these questions in this illuminating and Fully-Connected discussion that brings this new data technology into the light.

Practical AI #166

Exploring deep reinforcement learning

In addition to being a Developer Advocate at Hugging Face, Thomas Simonini is building next-gen AI in games that can talk and have smart interactions with the player using Deep Reinforcement Learning (DRL) and Natural Language Processing (NLP). He also created a Deep Reinforcement Learning course that takes a DRL beginner from zero to hero. Natalie and Chris explore what’s involved, and what the implications are, with a focus on the development path of the new AI data scientist.


Get the daily Wordle on the first try using the tweet distribution

I love how much hacking has been inspired by Wordle.

The Wordle source code contains 2,315 days of answers (all common 5-letter English words) and 10,657 other valid, less-common 5-letter English words.

We combine these to form a set of 12,972 possible words/answers.

We then simulate playing 1,000 Wordle games for each of these possible words, guessing based on the frequency of the word in the English language and the feedback received.

We then use three measures to evaluate our valid words against the observed distribution of ⬛🟨🟩 squares on Twitter.

The resulting code is included in the article.
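The core of any such simulation is scoring a guess against a candidate answer. A minimal sketch of that feedback step is below; the function name and the G/Y/B pattern encoding (green/yellow/gray) are illustrative assumptions, not taken from the article's code:

```python
from collections import Counter

def wordle_feedback(guess: str, answer: str) -> str:
    """Score a 5-letter guess against an answer, Wordle-style.

    Returns a 5-character pattern: G = green (right letter, right
    spot), Y = yellow (letter present elsewhere), B = gray/black.
    """
    feedback = ["B"] * 5
    # First pass: mark greens, and count the answer's unmatched letters.
    remaining = Counter()
    for i, (g, a) in enumerate(zip(guess, answer)):
        if g == a:
            feedback[i] = "G"
        else:
            remaining[a] += 1
    # Second pass: mark yellows only while unmatched copies remain,
    # so duplicate letters in the guess are not over-counted.
    for i, g in enumerate(guess):
        if feedback[i] == "B" and remaining[g] > 0:
            feedback[i] = "Y"
            remaining[g] -= 1
    return "".join(feedback)

print(wordle_feedback("crane", "cigar"))  # → GYYBB
```

Repeating this over the 12,972-word set, and comparing the simulated patterns to the distribution of patterns people tweet, is what lets the approach rank likely first guesses.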

Alex Strick van Linschoten

ZenML helps data scientists work across the full stack

ZenML is an extensible MLOps framework for creating production-ready machine learning pipelines. Built for data scientists, it has a simple, flexible syntax, is cloud and tool agnostic, and offers interfaces/abstractions tailored to ML workflows.

The code base was recently rewritten from the ground up, with better abstractions, to set us up for ongoing growth and for integrating more of the tools that data scientists love to use.
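To illustrate the step-based pipeline pattern such frameworks provide, here is a minimal plain-Python sketch. The `step` decorator and `run_pipeline` function are hypothetical stand-ins for illustration only, not ZenML's actual API:

```python
# Hypothetical sketch of a step/pipeline pattern; NOT ZenML's real API.

def step(fn):
    """Mark a function as a pipeline step (illustrative stand-in).

    A real MLOps framework would also handle artifact tracking,
    caching, and running the step on cloud infrastructure.
    """
    fn.is_step = True
    return fn

@step
def load_data() -> list:
    # In a real pipeline this might read from a feature store or bucket.
    return [1.0, 2.0, 3.0, 4.0]

@step
def train_model(data: list) -> float:
    # Stand-in "model": just the mean of the data.
    return sum(data) / len(data)

def run_pipeline() -> float:
    """Chain the steps, passing each output to the next step's input."""
    data = load_data()
    return train_model(data)

print(run_pipeline())  # → 2.5
```

The value of the abstraction is that the same step definitions can be re-run locally or on cloud infrastructure without changing the data-science code, which is what "cloud and tool agnostic" refers to above.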

Practical AI #160

Friendly federated learning 🌼

This episode is a follow up to our recent Fully Connected show discussing federated learning. In that previous discussion, we mentioned Flower (a “friendly” federated learning framework). Well, one of the creators of Flower, Daniel Beutel, agreed to join us on the show to discuss the project (and federated learning more broadly)! The result is a really interesting and motivating discussion of ML, privacy, distributed training, and open source AI.

Practical AI #158

Zero-shot multitask learning

In this Fully-Connected episode, Daniel and Chris ponder whether in-person AI conferences are on the verge of making a post-pandemic comeback. Then on to BigScience from Hugging Face, a year-long research workshop on large multilingual models and datasets. Specifically they dive into the T0, a series of natural language processing (NLP) AI models specifically trained for researching zero-shot multitask learning. Daniel provides a brief tour of the possible with the T0 family. They finish up with a couple of new learning resources.

Practical AI #157

Analyzing the 2021 AI Index Report

Each year we discuss the latest insights from the Stanford Institute for Human-Centered Artificial Intelligence (HAI), and this year is no different. Daniel and Chris delve into key findings and discuss in this Fully-Connected episode. They also check out a study called ‘Delphi: Towards Machine Ethics and Norms’, about how to integrate ethics and morals into AI models.

Practical AI #156

Photonic computing for AI acceleration

There are a lot of people trying to innovate in the area of specialized AI hardware, but most of them are doing it with traditional transistors. Lightmatter is doing something totally different. They’re building photonic computers that are more power efficient and faster for AI inference. Nick Harris joins us in this episode to bring us up to speed on all the details.

Practical AI #154

🌍 AI in Africa - Makerere AI Lab

This is the first episode in a special series we are calling the “Spotlight on AI in Africa”. To kick things off, Joyce and Mutembesa from Makerere University’s AI Lab join us to talk about their amazing work in computer vision, natural language processing, and data collection. Their lab seeks out problems that matter in African communities, pairs those problems with appropriate data/tools, and works with the end users to ensure that solutions create real value.
