Does lidar see Z?

In prep for a possible feature request post…

Does the lidar only “see” X and Y, or can it see multiple layers in the Z direction?
XY = it sees a square that's 1 mm × 1 mm
Z = it sees the above, plus that it's 1.5 mm tall across multiple layers

Thanks!

The laser cutter lidar sees Z.


He was asking about the X1C. @kungpaoshizi, the lidar does see Z: it performs first-layer detection, so it can tell whether part of the first layer is above or below where it should be.


The next question would be what’s the allowable mm depth perception?

Do you mean how far it can see or the precision?


Oops! Thanks BB!

The X1's sensor is not really lidar. Maybe faux-dar, camera-dar, fake-dar? Marketing-dar?

It only “sees” Z in a certain, limited context. It’s not like lidar at all in that aspect.

If you supplied more details as to what you are trying to do maybe I can answer more precisely.


So the lidar gives us a certain degree of dimensional accuracy…
I just realized something. After dealing with a picky PETG, where the print-start calibration wasn't working and I had to actually run the calibration manually, I got an idea.
I see my retraction settings pop out as a secondary issue.
Why not run a calibration of, say, 5 lines, where each line is 10 mm long with a second layer that's 5 mm, repeated for the other 4 lines?
This would allow for testing retraction settings 5 ways: print each line with a different setting, use the lidar to see Z where the retractions happen, and pick the best setting depending on whether the retraction (or de-retraction) pokes above or below the "point of flatness" where the top of the layer should be. If that makes sense.
This could also be used to analyze Z hop in a similar fashion.
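As a rough sketch of how such a comparison might work, assuming the lidar reports a height deviation at each retraction point along each test line (the retraction values, deviations, and function name here are all made up for illustration):

```python
# Hypothetical data: retraction length (mm) -> lidar-measured height
# deviations (mm) at the retraction points along that test line.
# Positive = material poking above the layer top, negative = a gap below it.
measurements = {
    0.2: [0.12, 0.10, 0.11],
    0.4: [0.05, 0.06, 0.04],
    0.8: [0.01, -0.02, 0.01],
    1.2: [-0.06, -0.08, -0.05],
    1.6: [-0.11, -0.13, -0.12],
}

def best_retraction(data):
    """Pick the setting whose surface stays closest to flat."""
    return min(data, key=lambda r: sum(abs(d) for d in data[r]) / len(data[r]))

print(best_retraction(measurements))  # -> 0.8
```

The same scoring idea would apply to Z hop: sweep the setting across test lines and keep whichever one leaves the smallest absolute deviation from the expected layer top.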

That's the first I've heard of that. How do you know? And what's the difference from a true lidar?

I sense we have a defensive P1S user since the term 'marketing-dar' came into conversation… 😛

The lidar is present, and there is a secondary camera as well iirc. This is why you see the red line, then you see the light turn on from underneath.
Even if the lidar only had crappy human-vision resolution, it would still do better than a human. But it actually measures to the micron level, which our eyes can't do.
So yes, I’m used to the biased hate from voicing my appreciation of the lidar.

The lidar was the whole reason I bought the X1C over the P1S (which is otherwise just another printer).
I will NEVER, ever, ever, ever, ever, ever, ever buy another printer without the lidar or something similar to replace my aging eyes and patience.


Because I have an X1 at home to play with, and I'm an engineer.

Lidar is a photon time-of-flight technique.

Bambu X1 faux-dar is at best a structured-light disparity/triangulation system. At worst they are just using it as a sort of enhanced background-removal/image-segmentation system. It bears absolutely no resemblance, at all, to lidar.

On the laser cutter they also claim to have lidar, I don’t know how that one works since I haven’t seen it, but it definitely measures Z. Probably still structured light but it could be a TOF sensor.

LIDAR = Light Detection and Ranging. If it says it’s LIDAR, by definition it measures distance in one dimension only. LIDAR does not measure X/Y position at all (unless you re-aim the sensor).

If it also measures X/Y, it’s not LIDAR. LIDAR systems work by measuring Time of Flight, only.

Structured Light sensors are not at all similar to LIDAR sensors, except that they emit light that is used to measure distance. But they work in completely different ways.

It would be totally incorrect to refer to a structured-light sensor as a LIDAR. So if the BBL documentation says "LIDAR", it's a time-of-flight sensor.
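For what it's worth, the time-of-flight relationship being described is just a divide-by-two on the speed of light; a minimal sketch:

```python
# Basic time-of-flight ranging: one-way distance = (c * round_trip_time) / 2.
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance_m(round_trip_s: float) -> float:
    """One-way distance from a measured round-trip time."""
    return C * round_trip_s / 2.0

# A 1 ns round trip corresponds to roughly 15 cm of one-way distance.
print(tof_distance_m(1e-9))  # ~0.15 m
```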


I’m presuming you are replying to me, but if you believe that just because Bambu Lab calls it lidar it is lidar, I hear they are also looking for investors for their new beachfront development in Guizhou province.

A simple range-finding LIDAR sensor is both cheap and accurate. I have no trouble believing BBL. It’s not something that’s worth lying about.


I don’t want to speak incorrectly, so maybe I’m wrong.

Here is the lidar breakdown page. Point out where you think the TOF sensor is and I’ll go look for it.

There are different kinds of lidar, and different acronyms for it. It's funny you only brought "Light Detection and Ranging" to the conversation when it can also stand for, oh I don't know, laser imaging? Either way it's measuring on a micron scale, better than any of our eyes. PERIOD.

Please leave and stop derailing my conversation just because you believe you know it all since you're an engineer. I'm an engineer too, but I would rather hear from people talking about the technology AND WHO KNOW, than listen to someone bored trashing some random person's post on the net.

The device labeled "Micro LiDAR Camera" is the sensor. The device labeled "Micro LiDAR LED" is the emitter. The emitter pulses; the light bounces off the build plate and reflects back to the camera. The round-trip time of the light signal is used to compute the distance between the emitter/camera and the reflecting surface. There's a little math beyond the obvious divide-by-two, because the incidence angle of the emitter and camera with respect to the build plate isn't 90º, so the measured distance is a little bigger than if they were perfectly perpendicular.
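The geometry described above can be sketched roughly like this (the tilt angle is a placeholder for illustration, not Bambu's actual sensor geometry):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def vertical_height_mm(round_trip_s: float, tilt_deg: float) -> float:
    """Vertical height from a slanted time-of-flight measurement.

    Hypothetical geometry: emitter/camera tilted tilt_deg away from the
    build-plate normal, so the slant distance (the obvious divide-by-two
    of the round trip) is longer than the true vertical height by a
    factor of 1/cos(tilt).
    """
    slant_m = C * round_trip_s / 2.0
    return slant_m * math.cos(math.radians(tilt_deg)) * 1000.0

# At zero tilt the vertical height equals the slant distance.
print(vertical_height_mm(1e-9, 0.0))   # ~149.9 mm
print(vertical_height_mm(1e-9, 60.0))  # ~74.9 mm
```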

Both the LIDAR camera and emitter are clearly identified in images of the underside of the printhead on that wiki page. I'm at a total loss to understand your confusion, or why you'd think BBL would have any reason to lie about it…


Thanks for looking at the page.

I'll go fire up a cal and see what it does, because I don't actually know what the Micro Lidar LEDs do. I suspect they are just normal incoherent LEDs and therefore would be useless for TOF, but I'm happy to look; sometimes I am wrong.

But since I work on X1plus, I do know what the lidar camera does. It’s just a normal camera that captures normal images, we’ve got dumps of them. In fact when you upload a log to Bambu it includes images from that camera.

Do you have any other candidate aperture you’d like me to look at? It could be very small since it only needs to have an APD behind it, but I would still expect a lens of some sort.

We use tiny, integrated SMD TOF sensors in my product, but they wouldn't work in this application: they would just tell you the distance to the build plate and couldn't actually discern any filament (except in bulk).

Coherent light is not required for TOF. It would be for Doppler Interferometry, which can also be used to accurately measure position (and velocity). All TOF is doing is measuring how long it takes light from an emitter to be seen by the detector. Any old light will do.

Assuming it's a "global shutter" sensor, the emitter is pulsed and an exposure is made around the "expected" reflection time. The distances being measured are small. It might be that the intensity of each pixel in the camera image correlates to differences in distance.

I don’t know how BBL is doing it. But it’s not stuff that’s particularly hard to do.


It could be a Doppler interferometer with a camera used to read the interference, but it's missing an important thing: there is no beam splitter. And you are right, a normal camera could be used to read the fringes.

As to whether it could be done with "any old light source", that isn't true; that would make it impossible to detect the interference. Because lasers are cheap and work well, of course you'd just use a laser. It is possible to use incoherent sources; in fact I have one on my bench that would work, called an SLD. It has the unusual optical property of a short coherence length, which makes it desirable for this sort of application. We used them for making a type of measurement related to Schlieren imaging but simpler, called a shadowgraph. Anyway, their controller is about the size of a brick and they cost a couple of kilobucks; I guarantee you there isn't one in an X1. They'd just use a laser.

So far, no signs of a lidar.

Sorry, missed this part.

Unfortunately, global-shutter cameras don't have this kind of extreme shutter speed; you'd need picoseconds. Frame times for the sort of consumer camera in an X1 are measured in milliseconds.
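To put numbers on "you'd need picoseconds": the timing precision required to resolve a given distance follows directly from the speed of light, since the light covers the distance twice.

```python
C = 299_792_458.0  # speed of light, m/s

def timing_resolution_s(distance_resolution_m: float) -> float:
    """Round-trip timing precision needed to resolve a given distance."""
    return 2.0 * distance_resolution_m / C

# Resolving 1 mm needs ~6.7 picoseconds of timing precision;
# resolving 1 micron needs ~6.7 femtoseconds.
print(timing_resolution_s(1e-3))  # ~6.7e-12 s
print(timing_resolution_s(1e-6))  # ~6.7e-15 s
```

Millisecond-scale frame times are roughly nine orders of magnitude away from what direct millimeter-scale time gating would require.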

Here’s a picture of the SLD controller. Not going to fit in an X1.

I said it wasn’t a Doppler Interferometer.

Your definition of LIDAR is overly complex. LIDAR in its simplest form is simply measuring the time between the emission of a pulse of light and the reception of its reflection. There’s nothing that says it has to be coherent light.

There are many examples of LIDAR sensors that are basically just an IR LED, an IR Photodetector, and an inexpensive Integrated Circuit that’s designed for exactly this application.

Like I said, I don’t know exactly how BBL’s system works. But there is no reason to believe it is not a LIDAR, either from a technology standpoint, the visual evidence, or from a Marketing motivation perspective. Why would they make a big thing about something that could ultimately be shown to be bogus?

Not seeing what you expect to see doesn't constitute proof it's not what it claims to be. Take your printer apart, figure out how it works, show that it's not a LIDAR; that'd be proof. Otherwise, you're just speculating like I am. I would argue that speculating it is actually LIDAR is more likely to be correct than speculating it is not. But there's no value in arguing that, either. So I guess I'm done with the discussion unless you offer proof of your assertion. 🙂

But wait… I asked AI.
