Why is it important to consider sensor size in addition to optical solutions when designing an imaging system? Choosing the right sensor size and lens combination is critical to obtaining the desired results. Gregory Hollows, Director of Machine Vision Solutions, discusses the variety of sensor sizes available, from a 2mm horizontal dimension to a 90mm horizontal dimension, and their relation to field of view.
Hi, I am Greg Hollows, and welcome to the Imaging Lab. We are going to talk about Sensor Size. You might ask yourself, "why is Sensor Size important in an optical discussion?" But for imaging, it is critically important to choose the right sensor and lens combination together to get the desired results. Over the years, sensors have come out in a variety of sizes, ranging from as small as a sixth-inch format, which is about two millimeters across, up to something as large as ninety millimeters. Those are wildly different sizes for sensors, and not accounting for them when you put your system together can have very adverse impacts on your system.

Let's take a second here and try to understand why this is important. When you image through a lens, you are going to get a round Field of View, something like this CD that I have here, which I just took out of its case. Now, if we have a sensor size that is appropriate for it, like the back of this box, we'll notice as we lay the image and the sensor over one another that we get sensor coverage all the way around, because the image circle was large enough. Now, if instead I take that case I had before, change my sensor size to the size of that case, and lay my image over it, you will notice that there are black areas of the case left exposed by this image circle. I won't get full coverage, and I won't maximize what I'm doing for my application or for the sensor that I have chosen.

One of the big problems out there right now, which we need to understand, is that there is a variety of resolutions available. As we add more and more pixels, sensors have a tendency to grow, and that can make it difficult to choose the right lens. You can't simply plug a lens that was on one system into another and expect to get all the desired results.
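The coverage check described above comes down to simple geometry: the sensor's corners are its farthest points from center, so the sensor diagonal must fit inside the lens's image circle. A minimal sketch, where the sensor and image-circle dimensions are illustrative nominal values, not figures from the video:

```python
import math

def sensor_covered(sensor_w_mm: float, sensor_h_mm: float,
                   image_circle_mm: float) -> bool:
    """True if the lens's image circle fully covers the sensor.

    The sensor diagonal must not exceed the image circle diameter;
    otherwise the corners fall outside the circle and go dark
    (vignetting), like the exposed black areas of the CD case.
    """
    diagonal = math.hypot(sensor_w_mm, sensor_h_mm)
    return diagonal <= image_circle_mm

# A ~7.2 x 5.4 mm sensor (9 mm diagonal) under a lens rated for an
# 11 mm image circle: fully covered.
print(sensor_covered(7.2, 5.4, 11.0))   # True
# A larger ~12.8 x 9.6 mm sensor (16 mm diagonal) under the same lens:
# the corners are left uncovered.
print(sensor_covered(12.8, 9.6, 11.0))  # False
```

This is why a lens and sensor have to be chosen together: the same lens can be perfectly adequate for one sensor and clip the corners of a larger one.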
The other thing to understand here - again, we'll go back to our example - is what happens as the sensor size changes. Let's say this box is half the size it is now: we cover it up and turn it this way, and that is our new sensor size. The amount of the usable image that we're actually covering has changed. It is going to affect the amount of Field of View that we're able to see with the system. Critically important. So varying sensors will vary the Fields of View that we get. If you refer back to some of the other parameters we have talked about, Field of View is one of those critical things that usually ties back to resolution and the amount of the object we can see. It's a real disappointment to put a system together and not actually see what we want to get out of it.

On a lens, Sensor Size is usually specified as the maximum sensor coverage the lens can handle, usually in a format of a half inch, third inch, or two-thirds inch, or as a physical dimension. It is important to verify that against the sensor you are using, and to understand that a lens with larger coverage can handle any sensor smaller than that in the system.

That's Sensor Size. Next, we are going to talk in more detail about resolution and contrast, or you can click any of the links that are most desirable to you.
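The Field of View scaling described above can be sketched numerically. At a fixed lens magnification, the field of view is simply the sensor dimension divided by the magnification, so halving the sensor halves what you see. The 0.5x magnification and the 2/3" and 1/3" format dimensions below are illustrative assumptions, not values from the video:

```python
def field_of_view_mm(sensor_dim_mm: float, magnification: float) -> float:
    """Field of view along one axis for a lens at a given magnification.

    FOV = sensor dimension / magnification, so at fixed magnification
    the field of view scales directly with sensor size.
    """
    return sensor_dim_mm / magnification

# The same 0.5x lens on two different sensors:
# a 2/3" format (~8.8 mm horizontal) vs a 1/3" format (~4.8 mm horizontal).
print(field_of_view_mm(8.8, 0.5))  # 17.6 mm of the object is visible
print(field_of_view_mm(4.8, 0.5))  # 9.6 mm -- the smaller sensor sees less
```

Swapping a lens between cameras with different sensor formats therefore changes how much of the object appears in the image, which is why the field of view must be re-verified whenever the sensor changes.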