April 21: Physical AI & Robotics Dallas Live/In-Person Event
This Angelbeat Live/In-Person Event in Dallas on April 21 focuses on the rapidly growing and quickly evolving field of Robotics and Physical AI. Attendees have a unique opportunity to meet, interact with and learn from AWS, Amazon, Microsoft, Google and other world-renowned subject matter experts, plus see demonstrations of the latest robotic hardware.
By bringing these global leaders to the DFW Metroplex, Angelbeat facilitates invaluable peer networking, plus provides a highly personalized and local experience. You’ll also join the much larger Angelbeat online community, with access to proprietary content, special discounts on speakers’ products, services and software, plus invitations to future complimentary webinars and Ask-Me-Anything (AMA) online discussion groups.
The event takes place at the Doubletree by Hilton Dallas-Farmers Branch, 11611 Luna Road, Farmers Branch, with free parking and WiFi. Scroll down for the schedule. Click on a speaker’s name to view their LinkedIn profile. CPE credit hours are provided. Use your Angelbeat account - easily and securely created on the Memberspace platform - to register to attend by clicking the green button.
The content is a balance of strategic and high-level technical information, designed to help corporate, facility, manufacturing and IT leaders understand the latest developments, plus create plans to deploy Physical AI and Robotics within their organization. You’ll gain invaluable and practical insights on expanding proof-of-concept installations to company-wide initiatives, including not just technology considerations but also financial justification, HR/Labor impact and other important topics.
There will be case studies on manufacturing, distribution, healthcare & life sciences, real estate/construction/property management, government agencies, industrial service & maintenance firms, financial services, agriculture/farming and more, all of which share many common issues such as:
Autonomous Robot Performance without Internet Connectivity
AI Compute Power at the Edge
Data Privacy/Encryption/Confidentiality Issues
Integration of Physical AI, Agentic AI, Information Technology (IT) and Operational Technology (OT) Platforms
There will be demonstrations of various robots, with different form factors, hardware and applications including:
Micro robots: medical and surgical applications
Mobile robots: floor cleaning
Humanoid robots: complement or duplicate human movement
Warehousing robots: logistics for pallet and shipment movement
Manufacturing robots: specific use cases/production environments
Click here to learn about the Angelbeat Robotics Advisory Council (ARAC) DFW, a prestigious group of highly respected industry, technology and community leaders in the DFW Metroplex. ARAC members have provided strategic guidance to CEO Ron Gerber on the overall agenda and topics, plus will select two award winners, for the best presentation and best exhibit/demonstration at the event.
8:30 - 9:30: Attendee Registration, Coffee & Continental Breakfast, Exhibit Area Open
9:40 Maria Olson, Principal Advisor, AI and Data Strategy Leader, AWS/Amazon
Learn how Amazon is integrating Robotics, Physical AI, Automation, Agentic AI and Data Analytics to build the world’s foremost logistics, warehousing and delivery system. Get an “insider” perspective on what goes on within an Amazon warehouse, plus learn best practices and technology strategies that you can deploy within your organization.
10:10 David Randle, Worldwide Head of GTM - Physical AI, AWS
The world is moving toward an Autonomous Economy: a transformative economic model in which AI, edge computing, robotics, spatial intelligence and simulation technologies work together to enable systems to operate autonomously with minimal human intervention. Physical AI represents the convergence of these technologies, enabling computers to sense, understand, predict and act in the physical world, and creating unprecedented opportunities for customers to embrace the transition toward this Autonomous Economy.
Physical AI underpins the paradigm shift to autonomous operations, advancing traditional AI systems that operate purely in digital environments into intelligent systems that can perceive, understand, and act in the physical world. This technology set is transforming everything from transportation (self-driving cars) to manufacturing (lights-out facilities) to energy (minimal on-site staffing and automated inspections of hazardous areas) to healthcare (minimally invasive robotic surgeries) and much more. In a prior blog, AWS proposed a four-level Physical AI capability spectrum describing WHAT levels of autonomy Physical AI can enable. This session provides guidance on HOW to achieve those levels. An example can be found in the following blog featuring Diligent Robotics for healthcare.
In this session, we describe a holistic Physical AI framework as a blueprint to chart your course toward automation. The framework breaks Physical AI down from an abstract concept into practical, concrete capabilities that can be developed and integrated into your technical development roadmap. It addresses use cases today and prepares you to solve challenges tomorrow. It describes a continuous learning loop that connects the physical world (atoms) to the digital world (bits), accelerating the development of autonomy in physical operations. Lastly, we clarify the difference between Physical AI model training in the virtual world and real-time autonomous operations in the physical world, and explain how the two are connected in a cloud-to-edge hybrid deployment.
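The atoms-to-bits learning loop described above can be sketched schematically. This is a minimal illustration only; the function names are hypothetical placeholders, not part of AWS's actual framework or any real API:

```python
# Schematic sketch of a physical-to-digital learning loop (hypothetical
# names, illustrative only). Sensor readings from the physical world
# (atoms) update a digital model (bits), which in turn drives the next
# action, closing the loop.

def sense(world_state):
    return world_state  # stand-in for real sensor data

def update_model(model, observation):
    model.append(observation)  # stand-in for training/fine-tuning in the cloud
    return model

def act(model):
    return len(model)  # stand-in for an edge-side control decision

model, world_state = [], 0
for step in range(3):
    observation = sense(world_state)          # atoms -> bits
    model = update_model(model, observation)  # learn from the observation
    world_state = act(model)                  # bits -> atoms
```

In a real deployment, the `update_model` step would typically run in the cloud (training in the virtual world) while `sense` and `act` run in real time at the edge, which is the cloud-to-edge hybrid connection the session describes.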
10:40 Priya Aswani, Director, Architecture & Strategy, Agentic AI & AI Infrastructure, Microsoft
Priya will begin her presentation by discussing how Microsoft deploys robots to build, manage and optimize the performance of its own data centers.
Then Priya will discuss Rho-alpha, Microsoft’s innovative platform for advancing AI for the physical world.
Physical AI, where agentic AI meets physical systems, is poised to redefine robotics in the same way that generative models have transformed language and vision processing. Microsoft recently announced Rho-alpha (ρα), its first robotics model, derived from Microsoft’s Phi series of vision-language models.
Rho-alpha translates natural language commands into control signals for robotic systems performing bimanual manipulation tasks. It can be described as a VLA+ model in that it expands the set of perceptual and learning modalities beyond those typically used by VLAs. For perception, Rho-alpha adds tactile sensing, with efforts underway to accommodate modalities such as force. For learning, we are working toward enabling Rho-alpha to continually improve during deployment by learning from feedback provided by people.
Through these advancements, we aim to make physical systems more easily adaptable, viewing adaptability as a hallmark of intelligence. We believe robots that can adapt more easily to dynamic situations and to human preferences will be more useful in the environments in which we live and work and more trusted by the people who deploy and operate them.
11:10 - 1:00: Exhibit and Product Demonstration Area Open, Lunch is Served/Provided
PickNik Robotics helps companies bring their robotics vision to life—from ideation through to production. As experts in unstructured robotics, PickNik enables organizations to develop, validate, and deploy advanced robotic solutions more quickly and with less risk. The team’s mission is to make robotic automation accessible and impactful for every industry.
MoveIt Pro is PickNik Robotics’ commercial robotics software platform that extends the capabilities of the open-source MoveIt project. Designed for unstructured environments, MoveIt Pro provides powerful tools for motion planning, grasping, perception, and behavior sequencing. With pre-built robot behaviors and a modular runtime architecture, MoveIt Pro significantly reduces engineering effort and time-to-deployment. Its hardware-agnostic design supports a wide range of commercial and custom robotic arms, making it ideal for scalable automation across space, manufacturing, and logistics sectors.
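The behavior-sequencing idea described above, composing pre-built behaviors into a larger task, can be illustrated with a minimal sketch. The class and behavior names below are hypothetical illustrations, not MoveIt Pro's actual API:

```python
# Minimal behavior-sequencing sketch (hypothetical names; not the
# MoveIt Pro API). Each behavior reports success or failure; a Sequence
# runs its children in order and stops at the first failure, mirroring
# the idea of composing pre-built robot behaviors into a task.

class Behavior:
    def __init__(self, name, action):
        self.name = name
        self.action = action  # callable returning True (success) or False

    def tick(self):
        ok = self.action()
        print(f"{self.name}: {'success' if ok else 'failure'}")
        return ok

class Sequence:
    def __init__(self, children):
        self.children = children

    def tick(self):
        # all() over a generator short-circuits at the first failure
        return all(child.tick() for child in self.children)

# A hypothetical pick task composed from reusable behaviors.
pick = Sequence([
    Behavior("detect_object", lambda: True),   # perception step
    Behavior("plan_motion", lambda: True),     # motion-planning step
    Behavior("execute_grasp", lambda: True),   # grasping step
])
```

Because each behavior is a self-contained unit behind a common interface, the same task structure can be retargeted to different arms, which is the hardware-agnostic, modular-runtime design choice the paragraph describes.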
1:30 Rusty Harrington, Technical Product Leader, AI & Robotics, Kubota
Rusty will speak about Kubota’s innovative, AI-Powered, Autonomous Tractors and Construction Machinery, used in Agriculture/Farming and Real Estate/Property Development.
Rusty will specifically demonstrate Kubota’s Bloomfield Camera and its AI-powered visual, spatial and pattern-recognition technology.
2:00 Salah Ahmed, Head of Global AI Partnerships, Google
Learn how Google enables its customers and partners to seamlessly and securely integrate Agentic AI and GenAI “digital software” platforms with Physical AI and Operational Technology (OT) applications and hardware.
2:30 - 5:00: Exhibit and Product Demonstration Area Open, Closing Reception