October 29, 2023
-
Government
AI in public services will require empathy, accountability – Australian Government
Mint, 10/29/23. Physics-informed machine learning combines knowledge of the natural world, continuously gathered data, and machine learning to solve complex problems. By building the laws of physics into AI systems, for example by seeding them with simulations grounded in basic physical principles, the algorithms start from a physically plausible baseline and can make accurate predictions with far less data. The approach has the potential to extend the range of electric vehicles, improve care for cancer patients, and automate tasks previously done by humans, with valuable applications across many other fields. READ THE ARTICLE
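The summary stays at a conceptual level, but the core idea is easy to show in code. The sketch below is a generic physics-informed training loop, offered as an illustration rather than anything described in the article, and all names and values in it are assumptions: a small PyTorch network is fit to the differential equation dy/dx = -y with y(0) = 1, so the physics equation itself supplies the training signal and no labeled data is needed.

```python
# Minimal physics-informed sketch (illustrative only): train a network to
# satisfy the ODE dy/dx = -y with y(0) = 1, using the equation as the loss.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Tiny network approximating y(x)
model = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Collocation points where the physics residual is enforced
x = torch.linspace(0.0, 2.0, 64).reshape(-1, 1).requires_grad_(True)

for step in range(3000):
    optimizer.zero_grad()
    y = model(x)
    # dy/dx via automatic differentiation
    dy_dx = torch.autograd.grad(y, x, grad_outputs=torch.ones_like(y),
                                create_graph=True)[0]
    residual = dy_dx + y                        # physics: dy/dx + y = 0
    boundary = model(torch.zeros(1, 1)) - 1.0   # initial condition y(0) = 1
    loss = (residual ** 2).mean() + (boundary ** 2).mean()
    loss.backward()
    optimizer.step()

# The trained network approximates y(x) = exp(-x) without any labeled data.
print(model(torch.tensor([[1.0]])).item())  # roughly exp(-1) ≈ 0.37 after training
```

The physics residual plays the role that labeled examples normally would, which is why such models can get by with far less data.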
-
Use Case
AI and ML set to boost industry’s automation push
The Manufacturer, 10/29/23. According to a new survey by Make UK and Infor, manufacturers expect to increase their use of AI and machine learning to drive greater automation and improvements in productivity, efficiency, and quality. Companies are already investing in automation across a range of technologies and functions and plan to accelerate those investments over the next two years. Despite this, the majority of manufacturers believe the UK is falling behind its competitors, and the survey highlights barriers to further automation, including technical skills, data integration, and workplace culture. Make UK is urging the government to roll out the successful Made Smarter scheme nationwide to help SMEs adopt digital technologies and to address skills shortages. READ THE ARTICLE
-
Limitations
From batter to better: How AI can’t (yet) hit a home run in business
The Globe and Mail, 10/29/23. Artificial intelligence (AI) has become a prominent topic, moving stock markets and stoking concerns about jobs, but it is important to recognize the technology's current limits. AI can make predictions, yet it cannot make judgments, which depend on personal preferences and experience. The distinction shows up in settings as different as investors' varying risk tolerances and in-game decision-making in sports like baseball. This interplay between prediction and judgment is why humans still belong in the decision loop, and companies and investors should be cautious about relying on AI without human judgment. READ THE ARTICLE
-
YouTube
YouTube has AI creator tools, but creators are too busy battling AI to care
Polygon, 10/29/23. YouTube recently introduced a range of artificial intelligence (AI) tools for content creators, covering various stages of the content-creation process. Despite the controversy generative AI has stirred in other creative industries, the response to YouTube's new suite has been relatively muted. Many YouTubers are more concerned with the ways AI already affects the platform, such as copyright disputes and inaccurate content. Although some creators have experimented with AI tools or used them professionally, established creators show little interest in YouTube's offerings, and some worry the tools could de-skill newer creators and discourage their creative growth. The ultimate impact of YouTube's AI tools on the platform remains to be seen. READ THE ARTICLE
-
Energy
Generative AI’s Energy Problem Today Is Foundational: Before AI can take over, it will need to find a new approach to energy
IEEE Spectrum, 10/29/23. The popularity of generative artificial intelligence (AI) products like ChatGPT and Midjourney has pushed the technology to new heights, but at a steep energy cost: according to a report by Alex de Vries, current AI technology could consume as much electricity annually as the entire country of Ireland. While the training of large language models (LLMs) draws most of the environmental attention, the energy consumed during inference can be even higher. Efforts to address the problem include optimizing hardware and algorithms to improve energy efficiency without compromising accuracy, but those gains may inadvertently encourage even wider AI use and further growth in total consumption. Human self-regulation and critical evaluation of where AI is integrated are therefore crucial: institutions and developers need to recognize AI's limitations and ask whether it is the best fit for every problem, while users can help by discussing the environmental impact and pressing for transparency and monitoring of AI sustainability. READ THE ARTICLE
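To make the Ireland comparison concrete, the arithmetic below is a back-of-envelope sketch with purely illustrative inputs; the per-request energy and request volume are assumptions chosen for the calculation, not figures quoted in the article or in de Vries's report.

```python
# Back-of-envelope sketch: how per-request inference energy scales to
# national-grid magnitudes. All inputs are illustrative assumptions.

WH_PER_REQUEST = 9.0         # assumed energy per generative-AI request (Wh)
REQUESTS_PER_DAY = 9e9       # assumed daily request volume
IRELAND_TWH_PER_YEAR = 29.3  # Ireland's annual electricity use, approx. (TWh)

annual_wh = WH_PER_REQUEST * REQUESTS_PER_DAY * 365
annual_twh = annual_wh / 1e12  # 1 TWh = 1e12 Wh

print(f"Inference alone: ~{annual_twh:.1f} TWh/year, "
      f"about {annual_twh / IRELAND_TWH_PER_YEAR:.0%} of Ireland's annual use")
```

Even halving the per-request figure through better hardware or algorithms leaves inference at terawatt-hour scale if request volume keeps growing, which is why efficiency gains alone may not curb total consumption when they also encourage wider use.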