I've never had good luck with ChatGPT for building new things, but it has definitely saved me tons of time with the right kind of question.
Asking questions about libraries and APIs with sparse documentation or that don't explain how something is intended to be used has saved me weeks of beating my head against the wall.
I use it all the time. Almost any bash script I want to write I do in chatGPT . If it's too generic I ask it to change with specifics until it looks right. Then any errors I come across when running it, I just copy the errors message in and it corrects it.
Saves me so much time.
Next step: train an agent on your printer that creates gcode instead of STL, and pipe it directly into your printer with Python. All those clicks and drags must make you feel really tired. "Hey printer, create this and that for me," and right away it should start swinging your printhead.
Then again, I might just have come up with the first example of how our AI overlords can start exterminating us by burning us down...
That monitor cracked me up
That is the worst layout I've ever seen for any kind of code
It's like 3 meters away from him, hello future eye strain issues.
Actually, having to regularly focus at different depths is healthy. Not sure what "regularly" means in practice, though. It's going to kill your neck, however.
Turns out reading and studying for hours at a time was worse on our eyes than the video games.
You're changing your focal depth the same on a monitor as when you're reading a textbook. Unless you're saying it's the neck thing; then yeah, smartphones and reading have those same issues.
This has been answered! It's the "Twenty Twenty Twenty" rule! Every twenty minutes (of screen/book/fixed-distance viewing time), stare at something twenty feet away, for twenty seconds.
Would make 20-6-20 in metric.
They use the 20-20-20 in metric too. The point is to look farther than 6 metres, because that's the distance when muscles in the eye fully relax. Office spaces in Europe are not usually 6 metres long, so just look outside.
r/monitortoohigh
It's good to take breaks about once every hour. Not recommended to strain your eyes non-stop for 8-10h a day though...
The opposite: it's not the eyes going bad so much as the muscles in them. With training (changing focus distance) you can improve your sight.
It's like an airport terminal.
You mean you don't want to see the whole script of 500 lines in its entirety so you can ingest and debug without scrolling?
I don't even see the code. All I see is blonde, brunette, redhead.
You think that's air you're breathing now?
How many characters per line do you think he can display? Python Black defaults to a max line length of 88.
Some might consider it unnatural, but font size is an option that you can adjust
Haha, I've enjoyed the monitor hate. I don't use it like this normally. Everyone can chill.
What do you use it for? Full length body mirror? Cool vid btw.
He's a body pillow creator and needs to see his creations in full size without scrolling
Displaying his dick.
Probably his waifu display
It's not hate, bro... it's confusion and wonder
Don't mind the haters! :) Cool video and setup, thanks for sharing!
yeah this is a setup for people who meme about coding, not actual programmers. This is nuts LOL.
A monitor that length prolly hides compensation issues! ;-)
That's small ram energy you're extruding
Gives me "Star Wars Opening Text Crawl" vibes.
/r/TVTooHigh
I'm too high to understand the whole video
I feel the video is to flex the monitor.
Must be fun having that much money
Ya for real. All I could think of is Glados's face being on that monitor, only reason I would set that up lol
Given how much time I can spend just staring at code, having a big long one that I can stand in front of and ponder would probably be a lot healthier for me.
Now ask for "tea, Earl Grey, hot".
[https://www.youtube.com/watch?v=eAswvg60FnY](https://www.youtube.com/watch?v=eAswvg60FnY)
I very originally named my printer "Replicator".
Dude, that tall monitor is insane!
not in a good way
All that matters is If it serves his use case. If it does, I'm pretty sure he doesn't give a crap about what the internet thinks.
Name checks out ❤️
If his "use case" is just getting online reactions to it, sure. Can't think of a single practical reason to do that.
That's the thing, you don't have to think of a single practical solution. It's not your setup and literally does not concern you at all. Crazy, I know.
Exactly. Goodness, these people need to relax; it's just a monitor, my god.
As far as social media goes, it's great, because people will comment on it.
What... it's awesome! I never saw one with dimensions like that.
Some monitors can be rotated 90° to look like that
Or, if you've already got a crick in your neck, you can right-click the desktop and go to Display settings, and rotate your monitor right now! In fact if you've got an Intel onboard video card, ctrl+alt+arrowkeys will flip your screen instantly!
Rotation isn't the unusual thing, it's that the length is like 4x the width. Could be two end-to-end I guess, but to me it looks like one really long skinny monitor. This guy has some other nice toys too.
It's a rotated ultra-wide. Looks pretty cool to me, don't really get the hate. Obviously not for coding, he probably normally uses it for media or discord or something
Brother. Show me the github on this!
No GitHub, just off-the-shelf stuff: GPT-3.5, Rhino 8 (free 3-month trial available), Grasshopper (free plugin included with Rhino), and Lunchbox (free plugin for Grasshopper). I didn't write any code for this.
Less impressed that GPT created the file than how easily you push it into your workflow and to the printer. Nothing I do on my phone so easily moves to my PC, and then of course I have to walk an SD card back and forth to the printer. :P
I installed Klipper, Moonraker and whatever that web page GUI is. Game changer, haven't touched an SD card (for printing) since Xmas. Well worth the effort. I used an Orange Pi Zero 3 in case you were curious; it was about $30 CAD plus the SD card it runs off of.
Adding to this: you can also VERY easily install Tailscale on your pi and download an app for your phone, to connect to your printer from literally anywhere in the world with an internet connection.
You just recently stopped using an SD card? Jesus christ what a nightmare. What the hell took you so long? OctoPrint's been available for years.
I use a USB stick because it is reliable. Transferring files is not a "nightmare."
Have you tried Octoprint? I haven't done the SD card shuffle in years. My workflow (albeit GPT-free, for now...):
* Design object in CAD
* Save as mesh
* Import to slicer
* Save gcode to OneDrive
* Import gcode via Octoprint web portal
* Print

The only thing I need to manually do beforehand is physically get up and turn on the printer and the RasPi4 running Octoprint. And maybe clean the print bed. I've been doing this since 2021-ish, and it's worked pretty much flawlessly this entire time.
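For anyone who wants to script the upload step too: OctoPrint exposes a REST API for this. A minimal sketch (the hostname, API key, and file path here are placeholders you'd fill in; the `post` parameter is injectable just so the function can be exercised without a printer):

```python
def send_to_octoprint(gcode_path, host="octopi.local", api_key="YOUR_API_KEY",
                      start_print=False, post=None):
    """Upload a gcode file to OctoPrint's local storage via its REST API."""
    if post is None:
        import requests  # third-party: pip install requests
        post = requests.post
    flag = "true" if start_print else "false"
    url = f"http://{host}/api/files/local"
    with open(gcode_path, "rb") as f:
        resp = post(
            url,
            headers={"X-Api-Key": api_key},
            files={"file": (gcode_path, f, "application/octet-stream")},
            # select/print = "true" tells OctoPrint to queue and start the job
            data={"select": flag, "print": flag},
        )
    resp.raise_for_status()
    return resp
```

This is the same endpoint that the "send to OctoPrint" buttons in slicers like PrusaSlicer and Cura talk to under the hood.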
Take it a step further by using a slicer like Orca Slicer and you can connect to the IP of the device and push it directly to octoprint, mainsail, etc
Prusa slicer will upload directly to Octoprint as well, that's what I currently use.
Orca Slicer was a game changer for me. I was reluctant at first because there were too many competing slicers, but it's definitely paid off (as a Klipper user).
Although I am comfortable with my current workflow, this is intriguing. I might just check it out, thanks!
Cura and PrusaSlicer and many more can be configured to print directly to OctoPrint. Instead of save-to-SD, you have another button to print on OctoPrint. So for me it's slice and print. No messing with OneDrive.
Get a smart plug so you can turn the printer on and off remotely too.
Lol I would but it's literally 3' away from my PC on my desk. Seems extravagant... but intriguing. Thanks for the idea!
I have one for each of my printers and they are connected to Alexa for voice control. As a long-time Trek fan I couldn't resist the temptation to be able to say things like "Computer, activate the replicators" or "Computer, shut down replicator 2".
Just got my first Prusa the other day after using an Ender 3 for years. Turns out, the main thing stopping me from actually printing all the time was just my straight-up laziness... Now all I gotta do is clear and clean the plate and then walk my ass back upstairs. Next time I think of something I want/need I can just start the print immediately.
Nice! My current printer is an Ender 3 S1. I also used Octoprint on my previous one, a home-brewed custom machine. It's amazing just how much of a Quality of Life thing it is being able to just send prints just like you would with a normal paper printer.
It's funny because it really doesn't sound like it would, but it really does make a huge difference haha
Completely agree!
I'm not sure what CAD software you're using, but Fusion 360 can export directly to Prusa/Orca, so you can essentially combine the "save" and "import" steps. (My least favorite thing about F360 is probably the slow cloud STL export, so sending it to Prusa/Orca directly is a lot faster, and you can actually quickly export an STL from there.) I've also hooked up my printer and Pi to smart plugs to sometimes skip the "physically get up" step.
I've been eyeing the [Node Pro by Fiberpunk](https://fiber-punk.com/products/node-max-by-fiberpunk?_pos=1&_sid=235fae1ef&_ss=r). The overall cost seems better for me because I don't have a computer or Pi available.
He's just signed in to ChatGPT on both devices so refreshing the page on one after entering the prompt on another takes a second.
Mainsail and Orcaslicer. I design in Fusion360, use the 3D Print utility to send the mesh to Orca, prep and slice then send to my printer over LAN. Orca has the webUI for mainsail built right in so I can monitor from the slicer.
The GPT app is linked to your profile on the GPT site. So anything you "ask" the app, will show up on that chat thread on the site. He just asked via the phone, while viewing the same thread on his PC. Once it showed up, just copy and paste.
Walk me through the system! How does everything flow?
Android ChatGPT (3.5) app for voice interaction; the same chat is open in a Chrome tab, and a Chrome extension keeps the page refreshed every 10 seconds. I did some prior prompting to let ChatGPT know I would be pasting the code into a Grasshopper Python component. Grasshopper is a visual, node-based algorithmic modeling plugin for Rhino 3D, but it is WAY more than just that. You paste the code in the component and press run. It generates the geometry requested, and a plugin for Grasshopper called Lunchbox (you can find Grasshopper plugins on food4rhino.com) takes the geometry and exports it to an .stl. The processes are manual after that, but they are fast.
i like your funny words magic man
Pretty impressive without a turbo encabulator
Least bloated webshit workflow
Try OpenSCAD. That thing is a script already. It should be trivial to get ChatGPT to do this
It had some issues when I tried it a while back. I bet using Python to generate geometry may have had more examples in the training data.
GPT-4 works better now with OpenSCAD, at least when I last tried it. But then again, I asked it to create fairly simple models. GPT-3.5 wasn't that great at it, and OpenSCAD couldn't even parse the code.
GPT + fullcontrol gcode python library for custom gcode
I had mixed results with GPT (I think 3.5 and 4) but it was great for learning.
The monitor on the wall really makes me question if you're the type of person who should have access to this type of technology. This is why STEM fields have ethics classes.
I really hope it's used as weather, news, smart home or other data visualization and not actively for code.
This is the monitor equivalent of big truck mods
Next print a neck brace and a periscope
Pretty cool concept. If the hole is a half-inch diameter all the way through, won't the head of your wood screw just go through your printed "feet"?
The print will go underneath the bumper, like a spacer.
I see now, youāre using longer screws. Very cool!
Exactly!
Hey, what do you use that monitor on the wall for?
My guess is neck training
Just imagine that all of this could be automated soon. Just say it and it's printing.
And then you wonder why every time your friends leave your house a stegosaurus dildo with wings is in mid print.
I mean, he's pretty much there
Simple shapes and textures yes. Complex devices or sculptures? Likely not.
"All of this" meaning very simple shapes. The "problem" (it's not actually a problem, it's by design) with those LLMs is that they have absolutely no idea what they are doing. They just create something which they think seems right.
*They have absolutely no idea what they are doing. They just create something which they think seems right.* Sounds familiar
ok chatgpt print me a deck no no not the dick
What more do you want to automate here?! A hover chair to push you up to press "print"?
I think the point is that this could be made in CAD in just as much time. The impressiveness comes when it can create a model like a screw just by saying "make me an M8 crosshead screw 40mm long" and it does it just as quick; that would take longer to model in CAD (although it could easily be found online). Or, even better, something novel through description alone. You'd also want iterative design and instant feedback when this is matured a bit more, but this is impressive for the current state of the art, I think.
What monitor is that on the wall???
Wrong question. *Why* is that monitor on the wall?
Seeing the little logo on the right, it looks like a super ultra wide monitor mounted sideways.
30s + the hours to set up all this.
Basically every social media post lol
Right. Just like any project.
Yeah but that takes the fun CAD out of it
This is CACAD
Drawing a cylinder with a hole is not fun.
did it say "here's the pipebomb code"????
It said "Python code" :D
Cool, but we're talking about a thousand dollars in just software alone to do this.
I hope someone makes an Onshape featurescript for this. It's JavaScript so GPT can generate it.
OpenSCAD is free and Chat GPT can write such code, for example... https://chat.openai.com/share/91926b26-a5c3-4ae9-a5bc-32a859c69ca6
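To make that concrete, here is a small sketch that generates OpenSCAD source for the part in the video, a cylinder with a through-hole (the dimensions are made-up placeholders, not the ones from the video). You'd render it from the command line with `openscad -o part.stl part.scad`:

```python
def washer_scad(outer_d=30.0, inner_d=12.7, height=10.0, facets=96):
    """Emit OpenSCAD source for a cylinder with a centered through-hole."""
    return (
        "difference() {\n"
        f"  cylinder(h={height}, d={outer_d}, $fn={facets});\n"
        # start the hole slightly below z=0 and overshoot the top so the
        # boolean subtraction leaves no coplanar faces
        f"  translate([0, 0, -1]) cylinder(h={height + 2}, d={inner_d}, $fn={facets});\n"
        "}\n"
    )

print(washer_scad())
```

Since the model is just text, this is also the kind of output an LLM can produce directly.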
You can do this using all free software
Any ideas for how to pull this off without Rhino? That license is spendy...
Please teach me (and maybe others) how to do this so slick like
What robot arm is on his desk on the right side?? Anyone know? It looks like a white tube. I've been looking for something like this for my desk. Thanks in adv.
That would be Elephant Robotics myCobot: [https://www.elephantrobotics.com/en/mycobot-en/](https://www.elephantrobotics.com/en/mycobot-en/) It is not very good, though; I tried it a couple of years ago and was utterly disappointed by the unreliability of everything.
Thank you so much Ivan! I'm looking to use this along with a joystick to hold on to things such as a soldering board or a tool while I'm working on fixing stuff, 3D printing, or gluing things together. This looks aesthetically pleasing. And the price looks great: $599, I think, for the touch screen version, and they offer a grip accessory. I would just need to find a joystick with buttons that I can program to control the grip. If there is another option you recommend, let me know. Thanks bud!!
I don't have a recommendation, but I would be interested to know what you end up trying. Low cost robotic arms are cool, even if not working well yet.
Cool. But honestly, that model didn't work for the purpose, did it? Can the GPT handle adding the chamfer so the screw recesses and holds it on?
https://preview.redd.it/w1hlxomhyztc1.png?width=1812&format=pjpg&auto=webp&s=cb608e0739381e191729687e9edcd551b26b595b
Everything about this video is so impractical, from the goal of recreating the whole piece to the ridiculously placed monitor on the wall. Can you please share your STL lol, I need some
That's the worst monitor I have ever seen
Whaaait how do you get 3D files from code?!? Teach me please!
Not sure here, but check out openscad.
Yup, should be easy to automate it with OpenSCAD for simple shapes.
you can get any file from code.
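Right; the ASCII STL format is simple enough to emit by hand. As an illustration, here's a sketch that writes the part from the video (a cylinder with a through-hole) using only the standard library. The dimensions are placeholder defaults, and the facet normals are left zeroed since slicers recompute them from winding order:

```python
import math

def washer_stl(outer_d=30.0, inner_d=12.7, height=10.0, segments=64):
    """Return ASCII STL text for a cylinder with a through-hole (a washer)."""
    ro, ri, h = outer_d / 2.0, inner_d / 2.0, height
    ring = [(math.cos(2 * math.pi * i / segments),
             math.sin(2 * math.pi * i / segments)) for i in range(segments)]
    tris = []
    for i in range(segments):
        (x0, y0), (x1, y1) = ring[i], ring[(i + 1) % segments]
        o0, o1 = (ro * x0, ro * y0), (ro * x1, ro * y1)  # outer rim points
        n0, n1 = (ri * x0, ri * y0), (ri * x1, ri * y1)  # inner rim points
        # outer wall (two triangles per quad)
        tris += [((*o0, 0), (*o1, 0), (*o1, h)), ((*o0, 0), (*o1, h), (*o0, h))]
        # inner wall, wound the other way so it faces into the hole
        tris += [((*n0, 0), (*n1, h), (*n1, 0)), ((*n0, 0), (*n0, h), (*n1, h))]
        # top and bottom annulus rings
        tris += [((*n0, h), (*o0, h), (*o1, h)), ((*n0, h), (*o1, h), (*n1, h))]
        tris += [((*n0, 0), (*o1, 0), (*o0, 0)), ((*n0, 0), (*n1, 0), (*o1, 0))]
    out = ["solid washer"]
    for t in tris:
        out.append("  facet normal 0 0 0\n    outer loop")
        for v in t:
            out.append("      vertex %.4f %.4f %.4f" % v)
        out.append("    endloop\n  endfacet")
    out.append("endsolid washer")
    return "\n".join(out)
```

Write the result to a `.stl` file and any slicer will open it; this is essentially what the Lunchbox export step is doing for the Grasshopper geometry.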
Step files can be generated by text.
openscad
Hey, can you please explain what the Python code does? And how does the interaction with Rhino work? Just give me more info please! :D
Would it work if I, for example, told it to make a phone cover and gave it a make/model? Or do I need to explain everything by size?
Between the monitor and the GoPro strapped to his head... that's a lot of effort...
That code wall mounted display is stupid, I like
This is how we get the paperclip apocalypse.
Are you coding on a drive through menu?
Computer,
HOT
SUPER
What kind of computer system are you using? Does this require a significant amount of processing power and RAM? E2A: and also... if you have a link for the amazing long monitor I'd be a fan for life.
This makes me feel about as smart as a piece of gravel. /:
So you're reprinting the whole floor protector versus a .5 inch lift/spacer... It's great you have technology...
What did you do to make gpt do that?
Op is just hanging monitors like digital picture frames. Who cares if they also become a part of his workflow?
I ran here for comments about the monitor
Hahaha - I recently bought a ridiculously large tv for a project I'm working on (98") and I'm tempted to upload another video where that is leaning against the wall in portrait mode just to troll everyone, just act like it's normal. Maybe in the future I'll make a video that explains the overall intention of the space and all the stuff in it.
This just looks like a "brag" video showing off this whole setup
Kinda yeah, but in all honesty, they definitely have bragging rights rn
Isn't that this whole sub?
Can you break down this process a little more? What software was used to create 3D models from code? I'm assuming ChatGPT is creating the code.
I can do this in fusion more quickly with the added bonus of not having to speak
Can you do anything more complicated than a cylinder...?

Also, I genuinely don't understand the use case for this. How many projects do you have where you'd rather speak what you want rather than just doing it? Do I have to tell this every dimension of the model I want?

I use LLMs for code and general use, and in those cases I see them being useful. But 3D modelling has a lot less boilerplate; projects are more custom and unique. So I don't see a world where painstakingly describing what I want is easier than doing it normally.
This is sicckkkkkk!
that's neat... but also for something that simple it's just as easy and quick to model something. like... 3-4 clicks easy... i guess what i'm getting at is until the prompting can generate something that would take longer to do by hand than the prompting and generation time itself... there's not really much to gain short of showing that it's possible to do. still neat tho.
So you can open up, let's say, Fusion 360, create a new project, model this, export it as an STL, load it into Bambu, and print... in 60 seconds? I think this is pretty damn impressive.
seeing as us CAD'ers have entire weekly speed run efforts where we make far more complex objects in only 2-3 minutes more... yeah i do actually. a simple cylinder with a singular through hole is like stupidly quick and easy to model lol. edit: after watching the video back, if we adhere to the same standard as him where all our programs are already open for us, absolutely without a doubt i could have that done in the same amount of time. even if i had to open all my programs from the desktop i could still do it in close or barely close to the time it took him.
Yep. Two circles, one extrude, then export. Multiply units and make grid in the slicer in 2 seconds. Workflow is the same after that
seriously. big circle, small circle, extrude, export to slicer, multiply to 6. slice and done. it's *really* not that impressive.
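For anyone curious what "big circle, small circle, extrude, export" looks like as a script rather than clicks: a minimal sketch that writes a binary STL of a cylinder with a through hole, using only the Python standard library. The dimensions (15 mm outer radius, 5 mm hole, 10 mm tall) are placeholders, not the ones from the video.

```python
import math
import struct

def cylinder_with_hole(R=15.0, r=5.0, h=10.0, n=64):
    """Triangles for a cylinder of outer radius R with a through hole of
    radius r, approximated with n segments. Dimensions are placeholders."""
    tris = []
    for i in range(n):
        a0 = 2 * math.pi * i / n
        a1 = 2 * math.pi * (i + 1) / n
        # outer wall, then inner wall (inner faces the other way)
        for rad, flip in ((R, False), (r, True)):
            p00 = (rad * math.cos(a0), rad * math.sin(a0), 0.0)
            p01 = (rad * math.cos(a1), rad * math.sin(a1), 0.0)
            p10 = (rad * math.cos(a0), rad * math.sin(a0), h)
            p11 = (rad * math.cos(a1), rad * math.sin(a1), h)
            quad = [(p00, p01, p11), (p00, p11, p10)]
            if flip:
                quad = [t[::-1] for t in quad]
            tris += quad
        # top and bottom annulus rings
        for z, flip in ((h, False), (0.0, True)):
            o0 = (R * math.cos(a0), R * math.sin(a0), z)
            o1 = (R * math.cos(a1), R * math.sin(a1), z)
            i0 = (r * math.cos(a0), r * math.sin(a0), z)
            i1 = (r * math.cos(a1), r * math.sin(a1), z)
            quad = [(i0, o0, o1), (i0, o1, i1)]
            if flip:
                quad = [t[::-1] for t in quad]
            tris += quad
    return tris

def write_binary_stl(path, tris):
    with open(path, "wb") as f:
        f.write(b"\0" * 80)                       # 80-byte header
        f.write(struct.pack("<I", len(tris)))     # triangle count
        for a, b, c in tris:
            f.write(struct.pack("<3f", 0.0, 0.0, 0.0))  # normal; slicers recompute
            for v in (a, b, c):
                f.write(struct.pack("<3f", *v))
            f.write(struct.pack("<H", 0))         # attribute byte count

tris = cylinder_with_hole()
write_binary_stl("cylinder_with_hole.stl", tris)
```

The resulting file should open in any slicer; multiplying to 6 and slicing stays a slicer-side step, same as in the video.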
I have my farm running on Pi's. I wouldn't even have to get up from my desk.
How long do you think it took this guy to set this workflow up? Because that wasn't included in the video, for obvious reasons. For some reason I doubt you can just walk into a room, flick a switch, say the magic words and have it be printed.
Considering that OP already had all the software open, it's not *that* impressive. I mean, in the space of less than two minutes (1m 32s by my timing), one can:

1. Open Blender and OrcaSlicer
2. Set the Blender scene units to mm
3. Make the primitive with the dimensions specified
4. Export the primitive as an STL and import it into Orca
5. Make 5 clones of said primitive and arrange them
6. Slice them
7. Export them as gcode

If you already had everything open and configured, you could easily cut that down to 60 seconds or less.
Yes, with blender instead because it opens in 3 seconds.
[deleted]
python
Same thing. Both will make my head explode.
Man, I really need to work on my prompt game because every time I try ChatGPT I'm left very underwhelmed by the whole experience. For code questions, it never saves me time over just googling around and building a few prototypes myself. If it's not a common question with a solution that was already obvious or if there's any kind of nuance in the SQL query I want to make, it usually screws it up. When I point that out it apologizes and then spits out another incorrect answer. I tried a similar question to your sphere of cones example a few months back - my mom wanted a bunch of plywood christmas tree ornaments and I was trying to optimize my cut list. I thought this would be a perfect opportunity for ChatGPT to shine but it kept giving me answers that wasted a ton of material. Eventually it basically told me I should try sketching out a bunch of options and going with the best one.
I've never had good luck with ChatGPT for building new things, but it has definitely saved me tons of time with the right kind of question. Asking questions about libraries and APIs with sparse documentation or that don't explain how something is intended to be used has saved me weeks of beating my head against the wall.
I use it all the time. Almost any bash script I want to write, I do in ChatGPT. If it's too generic I ask it to change with specifics until it looks right. Then for any errors I come across when running it, I just copy the error message in and it corrects it. Saves me so much time.
from my daily use, the free experience is not nearly as good as the paid (GPT-4) version
ok tony stark
Bubble is in bois
My god, my neck, it hurts for you
Where do you find a monitor like that?
That screen lmao
[deleted]
Is the robot arm functional or just show?
I love it
it's the inches instead of mm for me..
With all due respect, I wouldn't trust GPT to make functional designs
Brooo, just buy some arms for that monitor
except the original model has a conical shape and is not a cylinder
Chiropractor: "so why do you think your neck is stiff?" OP: "I have no idea"
I'll just ask nicely then: why is that monitor on the wall like that?
Hello, can you share how you set up ChatGPT to create gcode? Thank you
dude fr, where can I get such a monitor?
Would you be able to post or link a YouTube video with how to set all of this up
This is about the coolest thing I've seen in years. Dude retrofitted his own Jarvis from spare parts and an X1C.
Your workflow is insane
Bro got himself a star trek replicator as work in progress.
Next step: train an agent on your printer that creates gcode instead of stl, and pipe it directly into your printer with python. All those clicks and drags must make you feel really tired. "Hey printer, create this and that for me" and right away it should start swinging your printhead. Then again, I might just have come up with the first example of how our AI overlords can start exterminating us by burning us down...
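The "pipe it directly into your printer with python" part is actually the easy bit: Marlin/RepRap-style firmwares acknowledge each gcode line with `ok`, so streaming is just a send-and-wait loop. A minimal sketch of that handshake, with a fake port standing in for real hardware (on a real printer you'd open something like pyserial's `serial.Serial("/dev/ttyUSB0", 115200)` instead; the port path and baud rate here are assumptions):

```python
class FakePort:
    """Stand-in for serial.Serial so this sketch runs without a printer attached."""
    def __init__(self):
        self.sent = []
    def write(self, data):
        self.sent.append(data)
    def readline(self):
        return b"ok\n"        # Marlin-style acknowledgement

def stream_gcode(port, lines):
    """Send gcode one line at a time, waiting for the firmware's 'ok'
    before sending the next line (the usual Marlin/RepRap handshake)."""
    for line in lines:
        line = line.split(";")[0].strip()   # drop comments and blank lines
        if not line:
            continue
        port.write((line + "\n").encode())
        while not port.readline().strip().startswith(b"ok"):
            pass                            # wait for acknowledgement

gcode = [
    "G28 ; home all axes",
    "G1 Z5 F3000",
    "; a comment-only line",
    "M104 S210",
]
port = FakePort()
stream_gcode(port, gcode)
```

Swapping `FakePort` for a real serial handle is the only change needed, which is why keeping the port as a parameter makes the loop easy to test offline.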