I’m printing a 20x20 mm calibration cube using PETG HF filament on my Bambu Lab H2D printer, but the resulting dimensions are slightly off, coming in at around 19.9x19.9 mm. I’d appreciate some advice on how to improve dimensional accuracy. Are there specific settings or calibration procedures recommended for the Bambu Lab H2D?
Thank you! You’re absolutely right. It’s strange that the shrinkage of the PETG HF filament isn’t included in Bambu Lab’s standard profile, especially since Bambu Lab makes this filament themselves.
0.1 mm, wow… tough crowd. Might want to take up CNC machining. It will only cost you $5,000+ to do better. Yes, I’m joking, but only a little: it would cost you around $4,000 to get a machine that can consistently hold better than 0.004 of an inch (0.1 mm).
But if you must have better, I think Bambu made the Vision plate for you.
No, the Vision plate will not magically solve all your issues; it will not correct for shrink.
The Vision plate is solely for calibration of the XY range of motion, for whole-plate accuracy to “below 50 microns” (0.050 mm).
Reminder again: that’s for motion in XY. It has nothing to do with extrusion, build plate adhesion, loading the wrong filament, wrecked nozzles, wrong nozzles, failing to measure and account for offsets, and so on. It does not cure an operator’s lack of practice and attention.
Also in response to just4memike: 0.1mm is easy business on these machines (my opinion anyways). It’s even easy business on a manual machine, let alone CNC machines; no need to pretend that’s hardcore accuracy (sub-0.01mm is hardcore accuracy).
If you are trying to hold 0.001 of an inch, you need to sort out the XY motion, extrusion, and shrinking. I’d add skew as well, but I’m not sure that is adjustable as of now.
Don’t just dip your toe in accuracy… dive on in. It’s one crazy rabbit hole that I honestly don’t think is worth hassling yourself over (for 3D printing). But to each their own.
Just4, why are we mixing units now? One thou (0.001") is a quarter of 100 microns (0.1mm).
Also, one thou is half of the 50 micron repeatability being quoted with the vision encoder, so while not unattainable, YMMV (where mileage equals yields within spec).
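For anyone following along with the unit juggling, here’s a quick sanity check (plain Python arithmetic, nothing printer-specific):

```python
# Quick conversion check for the thou / micron mixing above.
MM_PER_INCH = 25.4

one_thou_mm = 0.001 * MM_PER_INCH                    # 1 thou = 0.0254 mm = 25.4 microns
print(f"1 thou = {one_thou_mm * 1000:.1f} microns")
print(f"as a fraction of 0.1 mm:     {one_thou_mm / 0.1:.2f}")    # ~0.25
print(f"as a fraction of 50 microns: {one_thou_mm / 0.050:.2f}")  # ~0.51
```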
My recommendation is not to dream about diving into crazy accuracy/repeatability requirements from a consumer desktop 3D printer of any kind, or anything that’s spitting out molten plastic for that matter.
Just my two cents as a machinist of 20 years in optics, medical, dental, electronics, aerospace, and ITAR work.
You’re killing me. Honestly, I’m not disputing the statement above, but it seems to be at odds with your previous statement that 0.01mm is a real statement of accuracy (which I agree with). Holding tenths (imperial) on a 3D printer is an exercise in futility (0.01mm is holding 4 tenths, or 4 ten-thousandths of an inch, for the uninitiated). You’re not gonna accomplish that on a consistent basis. You may hit it here and there, but the level of effort would be prohibitively expensive. Not to mention most people don’t even have the tools needed to measure that level of accuracy.
I’m in “Team 0.1mm is fine” on a 3D printer. For God’s sake, like you mentioned, it’s plastic being spewed from a 0.4mm hole at a few hundred mm/sec. Additionally, every change of material means you’ll need to climb down that rabbit hole over and over. It’s just not reasonable. And that’s not even mentioning the expense it takes to even measure it correctly.
I’m just a hobbyist machinist; none of my home-built machines could hold better than 0.002 of an inch in aluminum, so I wouldn’t try to school someone who’s been doing it for 20 yrs. I don’t think people should try to get crazy levels of accuracy out of these things, but again, I don’t want to tell people what to do with their time and money.
The guide and the files are not optimal in my opinion. It is very difficult to take the measurements without human error. Then, the part should be as big as possible while still being measurable. With typical calipers, that is 150mm. So a part smaller than 150x150mm is giving up accuracy without need. Not as bad as a 20x20x20 calibration cube, but still far from optimal. Then it only takes outside measurements for shrinkage compensation. If you have even slight over- or under-extrusion, you will calculate the wrong values. Only with both inside and outside measurements can you cancel the influence of extrusion deviation.
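To make that last point concrete, here is a rough sketch of the arithmetic (my own simplified model, not something from the guide): over-extrusion pushes outside dimensions up and inside dimensions down by roughly the same wall offset, so combining both measurements lets the offset cancel out.

```python
# Simplified model (my assumption, not from the guide):
#   measured_outside = s * nominal_outside + 2*e
#   measured_inside  = s * nominal_inside  - 2*e
# where s is the true shrink factor and e is the per-wall extrusion offset.
# Adding the two equations cancels e, so s can be solved directly.

def shrink_and_extrusion(nom_out, meas_out, nom_in, meas_in):
    s = (meas_out + meas_in) / (nom_out + nom_in)   # extrusion offset cancels here
    e = (meas_out - s * nom_out) / 2                # per-wall extrusion offset
    return s, e

# Purely illustrative numbers: 100 mm outside width, 50 mm inside pocket
s, e = shrink_and_extrusion(100.0, 99.65, 50.0, 49.75)
print(f"shrink factor = {s * 100:.2f} %, per-wall offset = {e:+.3f} mm")
```

With outside-only measurements, that same offset would get folded into the shrink factor and you would end up compensating the wrong thing.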
Oh, I haven’t posted the link yet; that was in another thread. So here too:
A lot of effort went into the model, and I try to explain in detail which pitfalls there are and how to avoid them.
Of course, 0.1 mm isn’t much. I also tested squares with sides of 40 mm and 100 mm. For the 40 mm square, the actual measurement was 39.8 mm, and for the 100 mm square, it was 99.6 mm. After that, I changed the ‘Shrinking’ parameter to 99.6, and the result was good: the deviations were ±0.02 mm on all test sizes. However, for holes, this somehow didn’t help. I couldn’t achieve an acceptable result, and even using the ‘Auto circle contour-hole compensation’ parameter didn’t help. And the ‘User Customized Offset’ parameter seems to have no effect at all.
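For what it’s worth, the 99.6 value matches what a quick calculation from those measurements gives (plain arithmetic, nothing printer-specific):

```python
# Shrink percentage from nominal vs. measured outside dimensions (numbers from the post above).
measurements = {40.0: 39.8, 100.0: 99.6}   # nominal mm -> measured mm

for nominal, measured in measurements.items():
    print(f"{nominal:>5.1f} mm -> {measured:.2f} mm  =>  shrink = {measured / nominal * 100:.2f} %")

# 40 mm gives 99.50 %, 100 mm gives 99.60 %. The larger part is usually the better
# estimate, because any fixed caliper or extrusion error is a smaller fraction of the reading.
```

Holes being off despite that is a separate effect, as discussed below.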
Yes, holes are really problematic and I don’t know of any slicer that has achieved good compensation of hole sizes.
X-Y hole compensation doesn’t cut it: 1) the deviation varies with hole size, and 2) open holes are affected as well, but hole compensation ignores them.
I’m currently running a test to identify how hole shrinking depends on hole size. I bought a set with 100 precision steel rods from 0.3mm to 5mm and am trying to find a formula. Maybe I’ll find something that could be implemented in slicers.
For now, I think we have to resort to adjusting the design based on test prints.
I’m a bit puzzled. I have a 50 mm square: if I reset the “shrinkage” parameter to its default value (100) and print it with the same Bambu Lab PETG HF filament, the side measures roughly 49.8 mm - OK.
Yet when I switch to the Arachne wall generator with its default settings, the side comes out almost perfect at 50.00 mm × 49.99 mm - even though no shrinkage compensation is applied via the “shrinkage” parameter.
I am also feeling a bit disappointed with the dimensional quality of the prints. I consistently have a deviation on the XY axes of 0.25 to 0.30mm on small prints. With my Prusa MK4S, I have an average deviation of 0.05mm, practically always perfect. I don’t want to manually adjust the contour compensation in the slicer for every print. I have invested a lot of money to have a machine with consistent quality and precision over time. I will conduct further tests in the coming days.
Absolutely. The effect is definitely independent of regular shrinking.
If you print a bunch of differently sized holes in the same part, you will see that small holes are much smaller than designed, while medium-sized holes are only slightly too small. In my first test with PLA, the smallest hole that was open at all was 1mm. It printed as 0.5mm. A 0.8mm hole was completely closed. A 2mm hole was 1.7mm and a 5mm hole was 4.85mm. That was with shrinkage already compensated.
I have bought a set of precision test pins from 0.3mm up to 5mm in 0.05mm steps, with 0.01mm precision. It is really difficult to measure the true size of small holes otherwise.
After the first test, I now print holes in narrow steps (0.1mm steps up to 2mm, 0.2mm steps up to 3mm and 0.3mm steps up to 5mm). I will also repeat that with different materials. In the end, I hope to find a formula for hole shrinkage, so that a single measurement per material is enough to compensate holes of all sizes correctly.
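Once the fine-grained data is in, a simple fit might already get most of the way there. Here is a rough sketch of the kind of thing I mean; the model (a fixed offset plus a term that grows for small holes) and the helper function are my own guesses, fitted to the three PLA numbers above, not anything a slicer currently implements:

```python
import numpy as np

# Printed PLA hole sizes from the earlier test (nominal mm -> printed mm),
# shrinkage already compensated. Placeholder data only.
nominal = np.array([1.0, 2.0, 5.0])
printed = np.array([0.5, 1.7, 4.85])

# Candidate model: deviation = a + b / d  (a fixed offset plus a term that
# matters more for small holes). Fit a and b by least squares.
deviation = nominal - printed
A = np.column_stack([np.ones_like(nominal), 1.0 / nominal])
(a, b), *_ = np.linalg.lstsq(A, deviation, rcond=None)
print(f"deviation ~= {a:.3f} + {b:.3f} / d  (mm)")

def compensated_diameter(target, iterations=5):
    """Diameter to model in CAD so the printed hole lands near `target`."""
    d = target
    for _ in range(iterations):          # fixed-point iteration on d = target + a + b/d
        d = target + a + b / d
    return d

for t in (1.0, 2.0, 3.0, 5.0):
    print(f"target {t:.1f} mm -> design {compensated_diameter(t):.2f} mm")
```

Whether a two-parameter model like this holds across materials and the full size range is exactly what your measurement series should show.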