Trending this week: AutoML with Auto-Keras; Extended Transformer Construction & BigBird sparse attention mechanisms by Google AI; Use Machine Learning as a tool, not the solution!
Every week we analyze the most discussed topics on Twitter by Data Science or AI influencers.
The following topics, URLs, resources, and tweets have been automatically extracted using topic modeling. Want to know more? Jump to the end of this article!
This week, Data Science and AI influencers on Twitter have talked about:
- Machine Learning & AI-Based Products
- Amazing Deep Learning Resources
- Internet of Things
Here are all the details for each topic:
Machine Learning & AI-Based Products
This section covers how machine learning can help businesses and how to use it to successfully build AI-based products.
Marcus Borba shared an article titled What Walmart learned from its machine learning deployment. The article argues that successful Machine Learning implementation means using automation as a tool, not the solution. In particular, it outlines the top challenges of implementing automated solutions and how Walmart overcame them in product development by using Machine Learning as a tool.
Ipfconline shared an article titled Types of ML-Driven Products, and How to Build Them. This article presents the different types of Machine Learning-driven products and provides 5 essential considerations to take into account when building such products.
Ipfconline also shared an article on How Machine Learning and AI Are Helping Developers. This article explains how AI and Machine Learning can benefit developers in their work. It provides 5 examples of tasks for which developers can harness the capabilities of machine learning and AI.
Amazing Deep Learning Resources
This week, a number of deep learning resources have been shared, including articles and papers on state-of-the-art deep learning, automated Machine Learning (AutoML), and a tool for automated monitoring of deep learning models.
Google AI shared an article published on their blog in March 2021 titled Constructing Transformers For Longer Sequences with Sparse Attention Methods. This article explains how "sparse attention methods reduce the dependency from a quadratic to linear and achieve state-of-the-art results on challenging tasks." It details the implementation of their Extended Transformer Construction (ETC) and BigBird attention mechanisms.
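To make the quadratic-to-linear idea concrete, here is a minimal NumPy sketch of the local (sliding-window) component of sparse attention: each token attends only to its neighbors within a fixed window, so the cost grows linearly with sequence length instead of quadratically. Note that ETC and BigBird combine this local pattern with global and (for BigBird) random attention; this sketch illustrates only the windowed part, and the function name is illustrative, not from either paper's codebase.

```python
import numpy as np

def local_attention(q, k, v, window=2):
    """Sliding-window attention sketch: token i attends only to tokens
    within `window` positions of i, giving O(n * window) cost instead
    of the O(n^2) cost of full attention."""
    n, d = q.shape
    out = np.zeros_like(v)
    for i in range(n):
        lo, hi = max(0, i - window), min(n, i + window + 1)
        scores = q[i] @ k[lo:hi].T / np.sqrt(d)   # scaled dot-product
        weights = np.exp(scores - scores.max())   # stable softmax
        weights /= weights.sum()
        out[i] = weights @ v[lo:hi]
    return out
```

With `window` set to at least the sequence length, the result coincides with full dense attention, which is a handy sanity check.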
They also shared a tutorial on Image similarity estimation using a Siamese network with a triplet loss, recently published on the keras.io website. This architecture allows models to "learn to estimate how similar two images look like."
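The training signal behind that tutorial is the triplet loss: given an anchor, a matching (positive) example, and a non-matching (negative) example, it pushes the anchor-positive distance below the anchor-negative distance by at least a margin. A minimal NumPy sketch of the loss itself (the Keras tutorial uses TensorFlow tensors; the function below is an assumption-free standalone version, not the tutorial's code):

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=0.5):
    """Triplet loss on embedding vectors: zero when the negative is
    already at least `margin` farther from the anchor than the positive,
    positive otherwise."""
    ap = np.sum((anchor - positive) ** 2, axis=-1)  # anchor-positive distance
    an = np.sum((anchor - negative) ** 2, axis=-1)  # anchor-negative distance
    return np.maximum(ap - an + margin, 0.0)
```

During training, minimizing this quantity shapes the embedding space so that similar images cluster together, which is what lets the Siamese network estimate similarity at inference time.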
PyTorch shared an article introducing PyTorch Profiler, their new and improved tool. This tool "collects both GPU and framework related information, correlates them, performs automatic detection of bottlenecks in the model, and generates recommendations on how to resolve these bottlenecks. It also provides relevant visualization."
Internet of Things
This week, a number of influencers shared a plethora of resources, use cases, and the latest trends in the IoT domain with their followers.
To find the most discussed topics on Twitter within the data science and AI community, we built a pipeline combining influencer analysis, data extraction, and NLP using the BERTopic Python library, a topic modeling technique that leverages:
- Sentence Transformers, to obtain a robust semantic representation of the texts
- HDBSCAN, to create dense and relevant clusters
- Class-based TF-IDF (c-TF-IDF), to produce easily interpretable topics while keeping the important words in the topic descriptions.
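The c-TF-IDF step above can be sketched in a few lines: all documents in a cluster are treated as one "class document", term frequency is computed per class, and an IDF-like factor down-weights words that appear across many classes. This is a simplified NumPy illustration of the idea; BERTopic's actual implementation differs in details (e.g. its exact normalization and smoothing).

```python
import numpy as np

def c_tf_idf(counts):
    """Class-based TF-IDF sketch.

    counts: (n_classes, n_words) matrix of word counts, where each row
    aggregates all documents belonging to one cluster (class).
    Returns a matrix of the same shape scoring each word per class.
    """
    tf = counts / counts.sum(axis=1, keepdims=True)   # term frequency within each class
    avg_words = counts.sum() / counts.shape[0]        # average word count per class
    idf = np.log(1 + avg_words / counts.sum(axis=0))  # down-weight words common to all classes
    return tf * idf
```

The highest-scoring words in each row then serve as that topic's description, which is how the topic labels in a digest like this one can be extracted automatically.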