Micro1 is paying thousands of global contractors $15/hr to record themselves doing household tasks, generating physical-world training data for humanoid robots.
Startup Micro1 has recruited thousands of gig workers across 50+ countries — including Nigeria, India, and Argentina — to mount iPhones on their heads and record domestic tasks like folding laundry and washing dishes. Workers earn $15/hour, strong pay in many local economies. The data is intended to train humanoid robots on real-world physical manipulation, a bottleneck that simulations cannot solve. MIT Technology Review flagged privacy and informed consent as unresolved issues with this model.
The core technical problem this solves is simple to state: physics simulation is still too inaccurate to train manipulation policies that generalize to messy real homes. Micro1's approach is essentially human-in-the-loop imitation learning at scale: egocentric video that can be converted into action trajectories for behavior cloning or RLHF-style robot training pipelines. If you're building anything in embodied AI, the data pipeline architecture here (iPhone-mounted POV recording plus structured task annotation) is a template worth studying, replicating, or partnering on.
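To make the imitation-learning framing concrete, here is a minimal sketch of behavior cloning as supervised regression: observations (stand-ins for features extracted from egocentric video frames) are mapped to action vectors by minimizing mean squared error against demonstrated actions. All names, shapes, and the linear policy are illustrative assumptions, not Micro1's actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

def behavior_clone(obs, actions, lr=0.1, epochs=200):
    """Fit a linear policy a = W @ o to demonstration pairs by
    gradient descent on mean squared error (hypothetical sketch)."""
    n, d_obs = obs.shape
    d_act = actions.shape[1]
    W = np.zeros((d_act, d_obs))
    for _ in range(epochs):
        pred = obs @ W.T                        # predicted actions, (n, d_act)
        grad = 2.0 / n * (pred - actions).T @ obs
        W -= lr * grad
    return W

# Synthetic demo: recover a known policy from its own demonstrations.
W_true = rng.normal(size=(2, 4))                # hypothetical "expert" policy
obs = rng.normal(size=(256, 4))                 # stand-in for video features
acts = obs @ W_true.T                           # demonstrated actions
W_hat = behavior_clone(obs, acts)
mse = float(np.mean((obs @ W_hat.T - acts) ** 2))
```

A production pipeline would replace the linear map with a vision model and the synthetic pairs with annotated (frame, action) trajectories, but the supervised structure is the same.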
If you're working on a manipulation or embodied AI project, benchmark your simulation data against published real-world datasets (e.g., Open X-Embodiment) this week — measure how much your policy degrades on out-of-distribution physical tasks to quantify exactly how much real-world data you actually need.
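The degradation measurement above can be sketched as a simple relative-drop metric over rollout outcomes. The success rates below are illustrative placeholders, not published numbers; substitute your own in-distribution and out-of-distribution evaluation results.

```python
import numpy as np

def degradation(in_dist_successes, ood_successes):
    """Relative drop in success rate from in-distribution tasks to
    out-of-distribution (OOD) tasks. Inputs are 0/1 rollout outcomes."""
    in_rate = np.mean(in_dist_successes)
    ood_rate = np.mean(ood_successes)
    return (in_rate - ood_rate) / in_rate

# Hypothetical example: 90% success under sim-like conditions,
# 60% on messy real-world tasks.
in_dist = [1] * 9 + [0] * 1
ood = [1] * 6 + [0] * 4
drop = degradation(in_dist, ood)
```

A relative drop near zero suggests your simulation data transfers well; a large drop quantifies how much real-world data (of the kind Micro1 is collecting) you would need to close the gap.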
Clone the Open X-Embodiment dataset repo: git clone https://github.com/google-deepmind/open_x_embodiment