
Would you guys go for a gestural interface for CAD/CAM?

I'd think it has a way to go for CAD/CAM. You're waving your finger, unconstrained, in 3D space, but the results show up on a 2D screen. It seems a pen interface would be way better for 2 1/2D applications, and better for 3D as well, as long as rotations are easy. The pen apps are already here.
 
I think I would want a footswitch or something to enable it, so that stuff isn't flying around like crazy if I swat a fly, sneeze, or scratch my ear.

Chip
 
For really complex parts or assemblies I think I would be exhausted by the time I was done. I think it would be more efficient to use devices that track eye movements, plus a couple of finger-controlled buttons to select, versus moving a mouse or trackball around. I know they have similar devices for quadriplegics, but it might take time to make them accurate enough to let you select between two objects on a screen that are very close to one another.
 
Yes, definitely.
Many months ago, I pointed out how 'stupid' CAD/CAM software is compared with a human.

If I said to you...
get an ally block,
block it up to 6" x 4" x 1/2" thick,
ground top and bottom,
and one datum edge,
bang in a thru hole, M8, about 1/2" in from each corner,
and a 5mm start hole for a wire eroder, centre about...
(see, I also switched from imperial to metric with no problems)

... I have in my mind what I would get.

Now do that lot in CAD/CAM.
You have to give it every bit of info,
because you have only a mouse and a keyboard to communicate with.
******
Having said that, I have just bought a tablet to read up and refresh my 5-axis skills.
With no mouse, what a pain in the butt that is.
******

As soon as PCs can understand conversations, and 'think' rather than just crunch numbers, it will all change.
This is a long way off, though; having just read 'Physics of the Future', PCs will not dominate over grey brain matter for some time yet.

Physics of the Future - Wikipedia, the free encyclopedia

Still scary stuff, though.
 
There was an article on Slashdot recently about a programmer who went to Dragon Voice Recognition due to carpal tunnel problems. He got efficient enough at it that he kept using it after he was better. The trick was he developed his own "language" for it to interpret, that was not necessarily English. Closer to R2D2 stuff, I guess. But it was unique enough to be unambiguous. He must work alone, though.

"Snert 15.125 gobble snert spizwilly .02 diameter" would get pretty old in the cubicle farm.

X-keys programmable keyboards work out well for some of my work software. You can write huge macros and call them with a physical key on a separate keyboard that you can label graphically or by text. Lots of programs do macros, of course, but keeping track is the "key".

It seems Kinect or Leap type interfaces would be good when someone else wants to look at the wireframe or model, for instance. They could quickly learn "turn this way to see the back" and stuff like that, without turning them loose on the keyboard/mouse. You could also keep them from doing damage by limiting input.

Chip
 
Used with a tablet or digitizer, yes, but I don't like how they work when used with a touchpad on a laptop. I used AutoCAD for a few years in the '90s with a 12" digitizer that had a template you could use to pick off at least 100 commands. After a while it took very little effort to find things, and it was really fast. One thing I could imagine would be a mouse with a cellphone-sized touch screen on top that could present different options. The poster above who said that the fundamental process is too hard has it right. Gestures are just lipstick on a warthog.