The introduction of digital technology in the 1960s and 1970s helped pave the way for numerically controlled (NC) machines similar to those we use today: CNC (Computer Numerical Control) machinery.
NC technology was originally developed in the 1940s by John T. Parsons, working closely with the Massachusetts Institute of Technology (MIT) on a project later commissioned by the United States Air Force. It provided a far more cost-effective way to manufacture aircraft parts, especially those with intricate geometries, and the technology eventually became standard within industry.
Advancements did not end there, however. In 1967, the concept of computer-controlled machinery began to be explored further, and as early as 1972 Computer-Aided Design (CAD) and Computer-Aided Manufacturing (CAM) marked prominent developments in CNC machinery. By 1989, these new CNC machines had set a new standard within industry. But what were the differences between NC and CNC?
The original NC machines were programmed with punched cards carrying a set of instructions known as G-code, which supplied the machinery with positioning commands. The drawback of these machines, however, was their hardwiring, which made it impossible for the operator to change the pre-set parameters.
CNC technology resolved this issue by writing, editing, and executing these codes through computer systems instead. In modern machines, G-code has been combined with logical commands, enabling the operator to make adjustments in real time.
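To give a sense of what these positioning instructions look like, here is a minimal G-code sketch. Dialects vary between controllers, so treat this as an illustrative fragment for a generic milling machine rather than a program for any specific controller:

```gcode
G21              ; interpret coordinates in millimetres
G90              ; absolute positioning (coordinates measured from the origin)
G00 X0 Y0        ; rapid move to the origin
G01 X25 Y10 F150 ; linear cutting move to X=25, Y=10 at a feed rate of 150 mm/min
M30              ; end of program
```

On an early NC machine, a sequence like this was fixed on punched cards; on a CNC machine, the same instructions live in the controller's memory, where they can be edited and re-run at will.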