
Microsoft has an AI image creator that is “powered by DALL-E.” Can it work to create images of hypothetical tools?
Let’s play around with it.
The first idea that came to mind is a Dewalt cordless drill in Milwaukee red colors. That seems easy enough, right?
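(For anyone who’d rather script this kind of experiment than click through Bing’s web interface, roughly the same prompt can be sent to a DALL-E model through OpenAI’s Images API. Here’s a minimal sketch of that route; the model choice, prompt wording, and output handling are my own assumptions, not a description of how Bing Image Creator works behind the scenes.)

```python
# Minimal sketch: sending the same sort of prompt to OpenAI's Images API.
# Assumption: the openai Python package is installed and OPENAI_API_KEY is set;
# the post itself used Bing Image Creator's web UI, not this API.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

result = client.images.generate(
    model="dall-e-3",  # illustrative model choice
    prompt="Dewalt cordless drill in Milwaukee red colors",
    n=1,               # DALL-E 3 returns one image per request
    size="1024x1024",
)

print(result.data[0].url)  # URL of the generated image
```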

This certainly has drill-like features, such as a handle, battery, trigger switch, and chuck-like thing.
Where did the Dwlill brand name come from?

Here’s a Duvil drill-like thing. I guess the AI image generator isn’t identifying Dewalt as a brand name. There’s something interesting going on with the chuck, and the clutch looks immobile.

Diailt?
This one has air vents where they’d never go, and the trigger switch looks fixed in place.

The chuck is off-center. The handle looks to be in the style of Milwaukee’s M12 installation driver, but with a chunkier forward section. Is that where the battery is supposed to go?
Let’s try something simpler:
“Yellow drill making holes in wood.”

It looks like these could be real, if not for all of the ways they’re blatantly wrong.
How about a Milwaukee cordless drill in Dewalt yellow colors?

You can tell this is a cordless drill, albeit not a realistic one.

The basic geometry on all of these is correct, at least if you squint.

I think the problem is that the AI generator is mashing things together without recognizing there are discrete components.

For all of the renderings, the AI engine seems to assume the drill bit is part of the chuck.

This is what it thinks a “modern cordless power drill” looks like. These renderings are better than the other examples.
The AI image generator seems to do a better job with less specific queries.
As mentioned, the AI seems to treat drills as monolithic products, but they’re not; they’re assemblies of multiple parts.
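One purely speculative follow-up, which I haven’t tested: spell out those discrete components in the prompt itself and see whether the generator treats the drill as an assembly. The component names and wording below are just illustrative.

```python
# Speculative prompt-building helper: list the drill's parts explicitly so the
# prompt describes an assembly of components rather than one monolithic blob.
COMPONENTS = [
    "keyless chuck",
    "clutch adjustment collar",
    "trigger switch",
    "pistol-grip handle",
    "slide-on battery pack",
]


def build_prompt(base: str, components: list[str]) -> str:
    """Append an explicit parts list to a base image prompt."""
    parts = ", a ".join(components)
    return f"{base}, with a {parts}, each clearly separate"


print(build_prompt("Dewalt cordless drill in Milwaukee red colors", COMPONENTS))
```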
Steven
Made some with Midjourney (paid)!
Dewalt cordless drill in Milwaukee red colors
https://cdn.midjourney.com/8ec3007b-d608-4b89-9511-f9bf34bba93c/0_0.png
https://cdn.midjourney.com/8ec3007b-d608-4b89-9511-f9bf34bba93c/0_2.png
Milwaukee cordless drill in Dewalt yellow colors
https://cdn.midjourney.com/4b49ee77-5df3-44dc-858a-5dd42d5dfe59/0_2.png
https://cdn.midjourney.com/4b49ee77-5df3-44dc-858a-5dd42d5dfe59/0_1.png
Stuart
Those are closer! I wonder if the source sampling size is smaller.
Andrew
Could be the free tier is deliberately dumbed down to save on costs.
Jason
I like the “dee-Walt” pronunciation
Scott ALKB
WALT
George
Generating just-slightly-off, horror-movie-esque images that come eerily close to lifelike but still the stuff of nightmares is DEFINITELY the strong suit for AI image generation.
The artists in our office have been having a ball with it and don’t seem to be worried for their jobs at all. From what I’ve seen, I agree.
And it’s quite fun to feed it non sequitur requests.
Stuart
Oh, that’ll be the next post.
I fed the AI simple but specific prompts, and the results ranged from grotesque to debilitatingly amusing.
The only realistic results involved kittens, but even those weren’t true to the prompts.
Perry
It’s akin to the “uncanny valley” effect, where a robot that comes very close to looking authentically human, but not quite, becomes frightening.
JoeM
If I could offer a simpler explanation… The AI is forbidden from generating anything copywrite or patent protected. Therefore, it lacks all reference to the proprietary companies’ actual tool images. Otherwise, it could simply change the colours. After all, it can search for what colour red or yellow it is, in hex value, based off other images on the Internet. But the second Bing rules out the actual name of the company, the Bastard AI has to obfuscate what it was told in your search, which also changes several variables it has been fed.
So, without reference to a real spiral drill bit, or an exact model to show you, it uses fractals based on the description “Drill” which “Twists” (hence the Salvador Dalí-inspired tips to the drills, bordering on liquid metal) that is listed in its databank of algorithms for image generation. This also would change the icon settings for drill, or hammer, to visible examples of the action, not the actual icons.
The weird names are just text obfuscation though. Even the numbers and fonts are being altered at random in order to avoid any association with DeWALT or Milwaukee logos or text.
Though, I will have to surrender on the set of four “modern drill” examples… I confess the top left is dangerously close to the design of a Mastercraft or Maximum drill that genuinely exists at Canadian Tire right now. How is that possible, considering what I just said about DeWALT and Milwaukee copywrite protections? Usually the CT house brands have significantly more subtle livery. It’s easy to see them as extremely generic. So, the AI generating something (four things, really) had a very high chance of duplicating a real Mastercraft, Motomaster, or Maximum branded Canadian Tire tool… They’re generated a little more curved than the CT house brands, but they do hit dangerously close to a bullseye here.
Stuart
If the AI must obfuscate brand names, why change them slightly in each independent render? I think it simply has difficulty with text in general. It combines and then attempts to refine any skewed words or markings.
Perry
Because each render is isolated in its creation. So each one is essentially a randomized creation of the same theme. It doesn’t recognize text any differently than it does the drill bit.
JoeM
This particular Bastard AI (Incomplete mini-segment of a full AI for the purposes of passing a Turring Test. They’re called Bastard AI, because they are designed to test a specific part of the final AI’s “Personality” one could say. They’re each a particular skillset or knowledgebase that is meant to adapt, or activate, when triggered by a final version Turring level AI, to respond to a specific human being interacting with it. In this way, each one adapts and becomes a single subset, or subroutine, of the Turring AI they’ll make up together with other Bastard AIs.) absolutely has trouble with text. Especially Fonts.
It’s possibly my own bias stating that all Microsoft products have bugs, and always have. Much of my career was dealing with Microsoft, and it has seriously put them off my radar because of the frustration they cause me. This Bing search engine is now using a product of another company… I get it… But somewhere in there, the Microsoft product isn’t giving the entire set of parameters to the AI in question. Hence the “Why did it do that?” factor. If you could do the same with ChatGPT or any of the other ones in the wild out there, chances are good they’d be a little closer to the original inquiry.
But, again… I will say that this is entirely pointless, and a horrible use of AI. Akin to leaving a room full of Chimps in charge of a Nuclear Bomb. Chimps will be Chimps, and they will abuse the trust you’ve put in them. Which also means, despite being fully ready to attempt to integrate the different Bastards into a full Turring Test, thus fulfilling a nearly-100 year old technological milestone that we’ve been working toward as a species… we’re likely to forget ever performing the Turring Test with these AI components. And we won’t pass the point of Simulacra versus Simulacrum for Computer Intelligence. It’s… Just a toy to play with, as far as anyone knows, or cares. But their true purpose is far more intense than this, and could mean a giant leap forward in the ability to operate our lives.
ToolGuyDan
You have gone to a tremendous amount of effort to apparently add the wrong spellings of “copyright” and “Turing” to your spellcheck dictionary. Alan Turing, in particular, doesn’t deserve this dishonor.
As to your theory of something called a “bastard AI”, which can be “integrated” with its siblings “into a full Turring [sic] test”, I don’t know where to begin. I can’t find any evidence of the “bastard” term being used in anything remotely relevant, and the Turing Test is simply an open-ended text conversation with a human; chatbots have been (at least occasionally) passing it for years.
What it might mean when you say to “integrate [disparate-but-related AI models] into a full Turing test” is roughly as apparent as if someone said to “integrate the Impact font” into IIHS’s car crash testing. Does that mean use it for the printed report? The website? In the lab? Or do you want us to crash a car into a 3D rendering of the word “Impact”, written in Impact?
JoeM
You have an excellent point there, ToolGuyDan… The Turing misspelling is my mistake, and still unintentional.
To your other concerns? Bastard AI isn’t a standard term. It’s a term I’m using for other various terms that represent subsets of a full AI. I don’t want to list them, as it takes up a lot of text room.
But I will say that, in programming terms, these represent different aspects of a complete personality. ChatGPT can converse and demonstrate some creative output, others I can’t remember the name of can interact with you as if they are from a certain region of the world, or others can simulate emotional reactions based on tone of requests and such… the list goes on. There’s a lot of overlap with them, intentionally. Because, once decompiled from programs, to code, a full Turing AI is made up of all that code in a single, final, AI. They become one entire AI, with its own resources to draw from, compiled from the resources gathered by the subsets. The term “Subroutine” is insufficient here, but if the final version is the complete system, these small devotions to AI are the equivalent of what a Subroutine is to any Program. Subset-Intelligences. Dedicated to singular situations. Since they’re all Code, they can be decompiled, put together as a whole, and tested to see if the final compilation can use all the resources it has to be more than just a program. It has to be able to interact in any form of communication, without anyone knowing the difference. That is the Turing AI. And given Turing lived in a much less technologically complex time, Software engineers, and Nerds like myself, have since considered new test parameters to overcome all those complexities.
As to “Copyright”… I can never remember how it’s spelled properly. I apologize.
TomD
These look more like rejected Star Wars props than real drills.
The image AI has no understanding of text as text, so it often gets horribly confused by it.
Midjourney is a bit better than DALL-E, if I remember correctly.
MM
My thoughts exactly when I saw the first pic of the “Milwaukee drill in Dewalt yellow”. That doesn’t look like a drill to me; it looks like some kind of sci-fi laser gun straight out of a movie or video game.
Rog
Midjourney is a much better product than Dall-e and others.
AI still has a problem generating unique text, rendering gibberish instead.
Andrew
Those look like an acid trip version of the power tools in question. Pretty wild.
OldDominionDIYer
Seems like a perfectly useless purpose for AI tech, wow! Is there an actual point attempting to be made? Apparently, I’m out of touch…
MM
Imagine the difference in cost for generating photos for marketing or simply explaining a point:
-Ask the AI and wait a few seconds
vs.
-Hire models and photographers, find an appropriate location, have demonstration tools and perhaps materials ready, etc.
OldDominionDIYer
If they are as totally useless as these are, AI is dead on arrival! LOL what a joke!
MM
Today’s AI is indeed a joke. But it’s only a matter of time. The Wright Flyer was a “joke” as well.
Stuart
Exploration of interesting new tech to satisfy personal curiosity?
ToolGuyDan
Think of something that, in the past, you’d have to hire a concept artist to mock up. “What are some ways you could add integral dust collection to DeWalt’s non-SDS drills?” The AI can spit out a half-dozen renderings in a few seconds, and while one or two might be obviously worthless, the other four will represent possibilities worth considering.
This isn’t that—Stuart isn’t a product manager for SBD, after all. So instead he gave us a little taste of AI in a different way. It’s still relevant, in my opinion. If it’s not relevant to you, then okay.
MFC
These are the things of nightmares. It’s kind of like when I was a kid and started noticing that artists wouldn’t accurately draw some of the technical things in their drawings. I figured it was because the item wasn’t the central focus and could be smudged and still get the point across, but sometimes I knew it was because the artist didn’t understand how the item functioned. AI seems to be the latter.
Nathan
Will these be up for sale on Amazon inside a month?
Jronman
Those might be able to pass for cheap Chinese knockoffs
Munklepunk
Here’s a DeWalt-colored Makita.
https://ichibandepot.com/products/makita-td002gzfy-td002g-40vmax-xgt-impact-driver-yellow-tool-only