Showing results for tags 'intelligent actuator'.

Found 8 results

  1. With robotics expanding rapidly across industry, the need for simple, easy-to-set-up grippers for robotic end-of-arm tooling is more important than ever. Intelligent Actuator makes some of the industry's best electric actuators; with simple setup and unsurpassed quality, their RoboCylinder line of products has been an industry leader for decades. Now IAI has released a new ELECYLINDER electric gripper.

    These two-position actuators with built-in controllers replace air-actuated cylinders and are set up in just four easy steps. They can be configured with the TB-03 teach pendant or IAI's new IA-OS software, and can even be configured wirelessly. These cost-effective grippers come in four variations, from 10 mm and 28 N of force up to 20 mm and 360 N of force. Best of all, they act just like a 5/2 valve with built-in sensors: a simple Backward or Forward command with returning Backward and Forward status signals, so two digital inputs and two digital outputs control and monitor the gripper with ease.

    For more information and to get the catalog, visit IAI's website: https://www.intelligentactuator.com/ Or go directly to the ELECYLINDER gripper announcement here: https://www.intelligentactuator.com/elecylinder-gripper-type-catalog/?awt_a=AREc&awt_l=IBx9y&awt_m=3ary._bPZtxraEc
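The two-command / two-status interface described above boils down to a simple handshake: energize one command output, then wait for the matching position signal. A minimal Python sketch of that pattern, where `set_output` and `read_input` are hypothetical stand-ins for your PLC or fieldbus I/O layer (they are not IAI API names):

```python
import time

def move_gripper(direction, set_output, read_input, timeout=2.0):
    """Command a two-position gripper via two DOs and monitor two DIs.

    direction: "forward" or "backward".
    set_output(name, value) and read_input(name) are hypothetical
    placeholders for your actual I/O access layer.
    """
    other = "backward" if direction == "forward" else "forward"
    set_output(other, False)       # drop the opposite command, like a 5/2 valve
    set_output(direction, True)    # energize the move command
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if read_input(direction):  # position-reached signal back from the gripper
            return True
        time.sleep(0.01)
    return False                   # timed out waiting for the end-position signal
```

The same two-wire-per-direction logic ports directly to ladder on a PLC: one rung per command output, one normally-open contact per status input.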
  2. Version 1.0.0

    29 downloads

    The attached file shows how to connect a Cognex In-Sight camera to an IAI SEL controller via TCP/IP, sending position data and other information from the camera to the IAI SEL controller. Please note: on the newer RA/SA/RAX/SAX/RAXD/SAXD and RSEL controllers, the controller's IP address and subnet are set in parameters 172-179 instead of 132-139 as indicated in the document.
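In-Sight cameras are commonly configured to send results as a delimited ASCII string over a raw TCP socket, which the SEL controller then parses. A minimal Python sketch of the receiving side, assuming a comma-separated "X,Y,Angle" payload and port 3000 (both are assumptions; match them to how the camera's TCP/IP output is actually formatted):

```python
import socket

def parse_position(data):
    """Parse an assumed comma-separated "X,Y,Angle" ASCII payload into floats."""
    x, y, angle = (float(v) for v in data.strip().split(","))
    return x, y, angle

def read_position(host, port=3000):
    """Receive one position string from the camera over raw TCP.
    The port and payload format are assumptions; configure the camera's
    TCP/IP device to match whatever format you settle on."""
    with socket.create_connection((host, port), timeout=5.0) as sock:
        return parse_position(sock.recv(256).decode("ascii"))
```

On the SEL side the equivalent work is done with the controller's string-handling commands after reading from the opened channel, as the attached document walks through.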
  3. Picking the right actuator and determining the pitch required to meet cycle times and carry the desired load can be tricky at times, but not if you are using IAI actuators. With the cycle time calculators created by Intelligent Actuator, available on their website linked below, you can quickly and easily figure out the right actuator for your application. https://www.intelligentactuator.com/cycle-time-calculation-software/
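The core math behind such calculators is a trapezoidal velocity profile: accelerate, cruise at top speed, decelerate. A simplified sketch of that estimate, which deliberately ignores settling time and the load-dependent speed/acceleration derating that IAI's real calculators account for:

```python
import math

def move_time(distance, v_max, accel):
    """Estimate travel time for a trapezoidal velocity profile.

    distance: move length (mm), v_max: top speed (mm/s),
    accel: acceleration = deceleration (mm/s^2).
    Simplified sketch only: no settling time, no load derating.
    """
    d_ramp = v_max ** 2 / accel          # distance consumed by accel + decel together
    if distance <= d_ramp:               # triangular profile: never reaches v_max
        return 2.0 * math.sqrt(distance / accel)
    return v_max / accel + distance / v_max  # ramp time + cruise-equivalent time

# Lead-screw pitch sets the achievable v_max: v_max = pitch (mm/rev) * motor rev/s,
# which is why pitch selection and cycle time are intertwined.
```

For a 100 mm move at 100 mm/s and 1000 mm/s² this gives 1.1 s, which is why a short move often gains little from a faster (higher-pitch) actuator: the move is ramp-dominated.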
  4. Let’s say you are trying to implement a pick-and-place application with your robot. Industrial robots are amazing at going exactly where they are told to go. But what if that place changes constantly, and we don’t know where the part is going to be next time around? That’s when we use machine vision to guide the robot to the right pick location. The general idea is that a vision system looks at the potential pick locations and tells the robot where to go to pick up the next part.

    I’m sure a lot of you would agree that communication is the key to success, and this case is no different. If two people speak different languages, the conversation is not going to work great. In a digital camera, a sensor collects light from the outside world and converts it into electricity. The sensor has “points” (you can call it a grid) on it that are called pixels, and the images we obtain from the camera are represented in these pixels. Robots, on the other hand, have coordinate systems, usually expressed in meters or millimeters. Two different languages...

    The whole magic is knowing where the camera is located relative to the robot’s end-of-arm tooling. The camera can be mounted either:

    • on the end-of-arm tooling, or
    • at a stationary location.

    If the camera is mounted on the end of the arm, we need to know the location of the camera relative to our gripper. At this stage we only need the relative location in two dimensions; the third dimension we can usually control. For example, if the camera is mounted 3 inches in X and 1.5 inches in Y away from the end-of-arm tooling, and our part is in the middle of the camera’s field of view, the robot needs to move -3 inches in X and -1.5 inches in Y (in its end-of-arm tooling’s coordinate system) to grab the part.

    Wait a second, what about Z? In the robot program, I always have a set location to take my picture before the pick, so I know how far the robot’s end-of-arm tool is from my parts in Z.

    But what if the part is not in the center of the camera’s field of view? The camera needs to report the part’s location to the robot somehow, right? Yes, and that’s where calibration comes into play. The basic idea is to calibrate the camera’s pixel readings into robot coordinates. Since robots usually work in real-world coordinates (mm or m), you can use the built-in calibration function in your camera software if it has one. These routines usually require some type of grid with squares or circles of known size so the camera can do the math to convert its pixels into real-world coordinates. I usually don’t do that: I would have to make sure my axes are aligned perfectly and would have to take lens distortion into consideration somehow. Instead, I make a randomly marked paper or plate (such as a plate with holes in it) and use that as the calibration grid. Here are some simple steps and tips to get the calibration process working:

    • Jog the robot to the location where the picture is going to be taken and take a picture. Make sure all the markers on the plate are visible in the image and that the top of the plate is the same distance from the camera as the top of the part you will eventually pick up.
    • Note the locations of all the markers in pixels from the camera software. You will need some tools in the vision system to get location data from these markers.
    • Jog the robot to each marker and note down its location in robot coordinates.
    • Match each marker’s pixel location with its robot location and do the math!

    Cognex In-Sight cameras have a built-in vision tool called N-Point calibration that makes this very easy: you select the markers on the plate and enter the corresponding robot coordinates in a table. The software takes care of the rest, and your location tools (like PatMax) will now report back in robot coordinates.

    The process is almost the same when the camera is mounted in a stationary location; you just need to know where the camera is located relative to the arm. The only other thing to be careful about is that the camera needs to be far enough from the pick area that the robot arm can swing in and grab the part without hitting the camera.

    Once you understand the basics, vision-guided pick and place isn’t so scary anymore. Does it still scare you? If so, leave a comment below or reach out to us at https://www.gibsonengineering.com/.

    Disclaimer: This blog post applies to using 2D cameras for pick and place of the same type of part. Depending on the parts and the hardware used, some additional steps may need to be taken.
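The "do the math" step amounts to fitting a transform from pixel coordinates to robot coordinates using the marker correspondences. A minimal sketch of the idea using a least-squares affine fit (my own illustration of the concept, not Cognex's N-Point implementation):

```python
import numpy as np

def fit_pixel_to_robot(pixel_pts, robot_pts):
    """Fit a 2D affine transform mapping pixel (u, v) -> robot (x, y).

    pixel_pts, robot_pts: sequences of (u, v) and (x, y) pairs for the
    same markers. Needs at least 3 non-collinear markers; extra points
    average out jogging/measurement noise. An affine fit absorbs scale,
    rotation, and axis misalignment, but not lens distortion.
    """
    P = np.asarray(pixel_pts, dtype=float)
    R = np.asarray(robot_pts, dtype=float)
    A = np.column_stack([P, np.ones(len(P))])        # rows of [u, v, 1]
    coeffs, *_ = np.linalg.lstsq(A, R, rcond=None)   # 3x2 affine coefficients

    def to_robot(u, v):
        """Convert one pixel location to robot coordinates."""
        return tuple(np.array([u, v, 1.0]) @ coeffs)
    return to_robot
```

Once fitted, feeding a PatMax-style pixel result through `to_robot` yields the robot-frame pick target directly, which is exactly what the camera then reports over the wire.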
  5. Version 1.0.0

    8 downloads

    The attached file was created by Intelligent Actuator (IAI) and demonstrates how to use EtherNet/IP with your SEL type controller (XSEL, MSEL, etc.).
  6. Version 1.0.0

    7 downloads

    The attached file shows how to use EtherNet/IP with your CON type controller. It includes AOIs (Add-On Instructions) for RSLogix and two different documents.
  7. Version 1.0.0

    4 downloads

    The attached file shows how to connect a Red Lion Graphite series OIT/HMI to an Intelligent Actuator (IAI) PCON controller using Modbus RTU.
  8. Version 1.1.0

    11 downloads

    The attached file shows how to connect a Mitsubishi iQ-F series PLC to an Intelligent Actuator CON type positioning controller via Modbus RTU. This is a cost-effective solution for on-the-fly position adjustment of a CON controller without adding a fieldbus option to the IAI CON controller.
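Every Modbus RTU frame the PLC sends carries a CRC-16 checksum that both ends compute over the message bytes. A minimal Python sketch of building a "write single register" (function 06) request, the kind of frame used for on-the-fly position updates; the register address below is a made-up placeholder, not an entry from the actual IAI CON Modbus register map:

```python
def crc16_modbus(data: bytes) -> int:
    """CRC-16/MODBUS: reflected polynomial 0xA001, initial value 0xFFFF."""
    crc = 0xFFFF
    for byte in data:
        crc ^= byte
        for _ in range(8):
            crc = (crc >> 1) ^ 0xA001 if crc & 1 else crc >> 1
    return crc

def write_single_register(slave: int, register: int, value: int) -> bytes:
    """Build a Modbus RTU function-06 frame, CRC appended LSB first.
    Take the real register address from the controller's Modbus
    documentation; 0x9900 here is only a placeholder."""
    frame = bytes([slave, 0x06]) + register.to_bytes(2, "big") + value.to_bytes(2, "big")
    return frame + crc16_modbus(frame).to_bytes(2, "little")

msg = write_single_register(slave=1, register=0x9900, value=1000)
# A receiver validates a frame by checking that the CRC computed over the
# entire frame (payload + appended CRC) comes out to zero.
```

On the iQ-F side the predefined-protocol or serial-communication instructions build these frames for you; the sketch just shows what goes over the RS-485 wire.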