I've always thought of 3D modeling (and, by extension, 3D printing) as a visual medium. While 3D-printed objects are certainly physical, the entire software chain that leads to them exists solely in the digital world. So my assumption was that, sadly, this hobby isn't viable for people living with visual impairments. But Redditor Mrblindguardian proved me wrong by developing an AI-based workflow that lets him model and 3D print his own custom designs, such as a one-winged dragon.
In addition to the obvious challenges, this comes with some difficulties that our sighted readers may not be aware of. We have language to describe what we see, but it doesn't hold the same meaning for people who have never been able to see.
For example, consider a question posed by William Molyneux in 1688: "Could a blind person, upon suddenly gaining the ability to see, recognize an object by sight that he had previously known by feel?"
In 2011, researchers at MIT answered that question by testing the premise in the real world with subjects who had undergone sight-restoring procedures. The results showed that tactile understanding did not carry over to the visual world. That should give you some insight into the challenges Mrblindguardian faced.
His solution is ingenious and takes advantage of AI tools that only recently became available. Mrblindguardian starts by typing out a description of what he thinks a dragon looks like, with the help of googled descriptions. He then uses Luma AI's Genie service to generate a 3D model based on that description.
To verify that the model "looks" right without the ability to see it, Mrblindguardian takes screenshots of the generated 3D model and feeds them to ChatGPT to describe. If the AI-generated description matches his expectations, then he knows the model looks right, at least to ChatGPT. If it doesn't, he can refine his Luma AI Genie prompt and repeat the process until the results are satisfactory.
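Mrblindguardian runs this check through the ChatGPT interface itself, but the same screenshot-description step could in principle be scripted. Below is a minimal sketch assuming the OpenAI Python SDK and an API key in the environment; the model name, the screenshot file name, and the prompt wording are illustrative assumptions, not part of his actual workflow.

```python
# Illustrative sketch only: ask a vision-capable model to describe a screenshot
# of the generated 3D model, so the answer can be read aloud by a screen reader.
# "dragon_render.png" and the prompt text are hypothetical examples.
import base64
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Load a screenshot of the generated 3D model and encode it for the API.
with open("dragon_render.png", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

# Request a detailed description of the render.
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {
            "role": "user",
            "content": [
                {
                    "type": "text",
                    "text": "Describe this 3D model render in detail: overall "
                            "shape, number of wings, pose, and any obvious defects.",
                },
                {
                    "type": "image_url",
                    "image_url": {"url": f"data:image/png;base64,{image_b64}"},
                },
            ],
        }
    ],
)

print(response.choices[0].message.content)
```

If the printed description doesn't match the intended design, the Luma AI Genie prompt can be adjusted and the check repeated.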
With a suitable STL file, Mrblindguardian can then use slicing software that is compatible with screen readers. To get a better sense of what's on screen, he can also have ChatGPT generate descriptions from screenshots. Once he's happy with the results, Mrblindguardian can ask a sighted friend to verify that the file is ready to print. If it is, he can print it and then post-process it by feel.
It is a laborious process, but it works. Mrblindguardian used it to 3D print his custom one-winged dragon, bringing a creature from his imagination into the real world, where he can feel it himself.
I can't help but feel tremendously impressed and inspired by Mrblindguardian's achievement, and I hope that others are able to take advantage of this workflow to produce their own designs.