A maker used OpenAI Codex to convert a hand sketch and two dimensions into parametric Python generators for a printable kids' pegboard system.
A developer/maker shared a Hacker News project where they photographed a sketch, pasted it into Codex with just two measurements (40mm hole spacing, 8mm peg width), and got working Python geometry generators in about a minute. The result: a 40mm pegboard system with 7 play pieces, 4 gears, and 2 printable boards — all defined as small Python scripts rather than hand-modeled meshes. The repo includes an AGENTS.md file explicitly designed to let coding agents extend the system safely. The project demonstrates a sketch-to-3D-print pipeline with minimal manual CAD work.
This project demonstrates a concrete agentic loop: image input → dimensional constraints → parametric Python generators → iterative physical output. The key architectural insight is keeping geometry as code (Python functions) rather than binary mesh files, which makes agent-driven iteration trivially fast. The AGENTS.md pattern — a machine-readable spec file for safe extension — is a reusable convention worth stealing for any generative design or config-heavy project.
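To make the "geometry as code" idea concrete, here is a minimal sketch of what a parametric generator might look like. The names (`BoardSpec`, `hole_centers`) and structure are illustrative assumptions, not code from the repo; only the 40mm pitch and 8mm peg width come from the source.

```python
# Hypothetical sketch of a parametric pegboard generator.
# Only the 40mm hole spacing and 8mm peg width are from the project;
# everything else here is an illustrative assumption.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class BoardSpec:
    cols: int = 4
    rows: int = 4
    pitch_mm: float = 40.0   # hole spacing from the sketch
    hole_d_mm: float = 8.0   # peg width from the sketch

def hole_centers(spec: BoardSpec) -> List[Tuple[float, float]]:
    """Return (x, y) centers for every peg hole on the board."""
    return [
        (c * spec.pitch_mm, r * spec.pitch_mm)
        for r in range(spec.rows)
        for c in range(spec.cols)
    ]

centers = hole_centers(BoardSpec(cols=2, rows=2))
# A 2x2 board yields 4 hole centers, 40mm apart on each axis.
```

Because the board is a pure function of a small spec object, an agent can add a variant by editing parameters or composing functions, then regenerate the printable mesh — no binary files to diff or hand-edit.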
Clone the repo, run the three generator scripts locally, then prompt Codex to add a new piece variant (e.g. a T-shaped connector for 3-peg intersections) by pointing it at AGENTS.md — measure how many lines of manual code you wrote vs. generated.
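The repo's actual AGENTS.md is not reproduced in this summary, so the snippet below is a hypothetical sketch of what such a machine-readable extension spec might contain, assuming sections for invariants and an extension recipe:

```markdown
# AGENTS.md (illustrative sketch, not the repo's actual file)

## Invariants
- Hole pitch is 40mm; peg width is 8mm. Do not change these constants.
- All geometry is defined in Python generator scripts; never commit meshes.

## Adding a new piece
1. Copy an existing generator script as a template.
2. Derive all dimensions from the shared constants, not literals.
3. Run the script and confirm it emits a printable, watertight model.
```

The value of the convention is that constraints live next to the code in a form an agent can read before editing, which is what makes "point Codex at AGENTS.md" a safe workflow.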
Clone the repo and run: python3 -m venv .venv && . .venv/bin/activate && pip install -r requirements.txt