NEW IDEA: Camera + filament samples + AI swap

There does not appear to be any place to theorize about solutions to shortcomings in the printer. Filament swapping and purging are not efficient, and lots of alternatives have been devised over the years.

I wonder if a holder with samples of the filaments loaded in the AMS could be photographed by the printer, and then the printer, using AI, could determine the amount of purging needed from one color to the next. Given black and white, it would start with a picture of both. It would purge from black to white until the output matches the white in the original picture, then purge from white to black until it matches the original black.
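
As a rough illustration of the idea (all function names, thresholds, and the camera model here are invented; the "camera" is simulated), the purge loop could look something like this:

```python
import math

def color_distance(a, b):
    """Euclidean distance between two RGB tuples."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def purge_until_match(reference_rgb, sample_fn, step_mm3=5.0,
                      tolerance=12.0, max_volume=800.0):
    """Purge in small increments until the sampled extrusion color is
    close enough to the reference photo's color, or a safety cap hits."""
    volume = 0.0
    while volume < max_volume:
        if color_distance(sample_fn(volume), reference_rgb) <= tolerance:
            return volume
        volume += step_mm3
    return max_volume

def fake_camera(volume, old=(0, 0, 0), new=(255, 255, 255), decay=0.02):
    """Stand-in for a real camera: the old color decays exponentially
    as more material is purged (black blending into white)."""
    mix = math.exp(-decay * volume)
    return tuple(o * mix + n * (1 - mix) for o, n in zip(old, new))

print(purge_until_match((255, 255, 255), fake_camera))  # → 185.0
```

The point is just the control loop: purge a little, look, compare, stop when the match is good enough, with a hard cap so a bad reading can never purge forever.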

Nice idea. I don’t think the printer itself has RGB functionality in its cameras, other than the one in the X1C for viewing/timelapse. I wonder if there’s a way to calculate the flush required from a photo of the filament, or, as you say, a simple sample print. We could use the Handy app as the means to do this, since most smartphone cameras are pretty sophisticated these days.

Based on greyscale it should be able to do that (in some sense),
but it depends on what resolution the nozzle cam uses.
Judging by the QR-code size, it is probably not a high resolution.

So to do that, the printer needs to print lines and check when the color is back to normal, i.e. fully swapped.
But this only works with black/white, as the camera cannot compare against a “full” picture; greyscale is only 256 shades (if not 16).

So the way I was thinking is that it would draw a line and see how much bleeding it has, or where the bleeding stops (black/white example).
Then it should store that value for that specific filament. If you use another color, it messes up the values again and the test needs to be repeated. And it is unlikely this will work across colors, since in greyscale different colors can look the same (depending on greyscale depth).

Anyway, that’s just what I think :smiley:
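
The purge-line idea above could be sketched roughly like this (the greyscale values, tolerance, and sample data are all made up for illustration):

```python
def bleed_end_index(samples, target, tolerance=8, stable_run=5):
    """Walk along a purge line (one greyscale reading per sample) and
    return the index where the value first settles at the target color
    for `stable_run` consecutive samples, i.e. where bleeding has
    stopped. Returns -1 if it never settles."""
    run = 0
    for i, value in enumerate(samples):
        if abs(value - target) <= tolerance:
            run += 1
            if run == stable_run:
                return i - stable_run + 1
        else:
            run = 0
    return -1

# Black-to-white purge line: starts dark, bleeds through greys, settles.
line = [10, 40, 90, 150, 200, 230, 248, 252, 251, 253, 250, 252]
print(bleed_end_index(line, target=255))  # → 6
```

Requiring a short stable run before declaring "fully swapped" guards against a single noisy camera reading ending the test too early.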

There are people adding those inner-ear microscopes to their printers to watch the plastic flow from the nozzle. The whole process would get very tricky when trying to switch between two similar colors, e.g. a light orange versus a slightly darker one.

You can manually change the purging amount via the “Flushing Volumes” button to the right of Filament, above AMS slot 2. People on the Discord and I can confirm that changing the Flush Multiplier to 0.6 and then hitting Auto Calibrate works well.
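
For reference, the multiplier is essentially a scale factor applied to each auto-calculated pairwise flush volume. A rough sketch (the floor value and the example volumes are illustrative, not Bambu Studio's actual numbers):

```python
def apply_flush_multiplier(flush_matrix, multiplier=0.6, minimum_mm3=107):
    """Scale every pairwise flush volume by the multiplier, keeping a
    floor so tiny values never drop below a usable purge amount."""
    return {pair: max(vol * multiplier, minimum_mm3)
            for pair, vol in flush_matrix.items()}

# Hypothetical auto-calculated volumes in mm^3, per (from, to) pair.
matrix = {("black", "white"): 700, ("white", "black"): 200}
print(apply_flush_multiplier(matrix))
# → {('black', 'white'): 420.0, ('white', 'black'): 120.0}
```

Note the matrix is asymmetric: dark-to-light transitions need far more purging than light-to-dark, which is why a single global multiplier is a blunt but convenient lever.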

0.46 is working great for me.

I don’t want to do anything manually. The purpose of this discussion is to devise a way to let it do it for us.

So that’s the issue, though: every filament is different, every brand is different, and some have more artifacts than others. There is too much to test, so a higher default value is safer, but you have the option to lower it. They’re not going to ship a lower default; that’s why they gave you the ability to lower it yourself.

Color is another factor: yellow to black and back to yellow is not yellow unless enough is purged out of the nozzle to get rid of all the black.

I love photographic research; it is huge in vehicle development in the automotive industry and the Rust Belt where I live, so I can speak to it. You will need a couple of things to make the camera aspect of this AI system reliable and repeatable in sampling the filaments.

  1. Lighting control: First things first, you will need to get the printer’s lighting environment under complete control. For this, you will want bright, neutral lighting, preferably from low-tolerance color-temperature LEDs with a high nit rating. The LEDs would need to be positioned so that lighting is even in intensity and color temperature across the entire print bed. This sets the stage for the printer’s local lighting environment, which will be crucial for the next step.

  2. Color calibration: You will need something like a calibration card containing several color codes/swatches. You would place this card on the print bed during initial startup so the print-head camera (the bed camera would not be close enough to get an accurate reading) can calibrate itself to a true reference neutral (established from what it sees on the card) across the visible color spectrum. This step would be manual, but it would be a one-time calibration done at first start-up only. Once the camera is calibrated after initial print setup, it is good for the entire life of the printer.
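
The swatch-card calibration in step 2 could, in principle, be reduced to fitting a 3×3 color-correction matrix that maps the camera's measured swatch readings back to their known reference values. A sketch with entirely made-up swatch data:

```python
import numpy as np

# Hypothetical reference card: true RGB values of the printed swatches.
reference = np.array([[255, 255, 255], [255, 0, 0], [0, 255, 0],
                      [0, 0, 255], [128, 128, 128], [0, 0, 0]], dtype=float)

# What the uncalibrated toolhead camera measured, e.g. with a warm
# color cast from the enclosure lighting (invented numbers).
measured = np.array([[250, 240, 215], [245, 10, 8], [12, 238, 20],
                     [10, 8, 220], [130, 122, 105], [8, 6, 4]], dtype=float)

# Solve measured @ M ≈ reference in the least-squares sense; M is the
# 3x3 correction applied to every subsequent camera reading.
M, *_ = np.linalg.lstsq(measured, reference, rcond=None)

def correct(rgb):
    """Apply the fitted color correction to one camera reading."""
    return np.clip(np.asarray(rgb, dtype=float) @ M, 0, 255)

print(correct([250, 240, 215]).round())  # should land near pure white
```

Six swatches over-determine the nine matrix entries, so the fit averages out per-swatch noise rather than memorizing it; a real card would use many more patches.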

Concerns and advisements: Users would need to be made aware of the limitations and requirements of this filament color detection system so they do the right things to ensure it always works properly.

  1. Limiting ambient light entry: The current model printer already has strong tinting on the door glass and top glass hood, which limits the amount of ambient light to an extent. However, enough ambient light can still pierce through the top glass, necessitating a blind or opaque cover to prevent light entry. The door is less of a concern, but if a printer room is very bright, an optional cover for the door might also be necessary at times. To save on expenses, users could simply turn the printer room’s lights off so the room is dark enough to keep color detection inside the printer accurate.

  2. Not allowing user lighting mods: As expected, user-modded lighting systems could no longer be used, since they would pollute the color balance and light intensity, as well as their evenness across all areas of the print bed.

And when I complained on Discord about purging, I was not provided that information. I really don’t see how anyone can find value in this noise-filled, everyone-can-say-anything environment, where anything goes as long as it doesn’t use the word “stupid” yet is still insulting. It’s like trying to read the mind of a person with multiple personality disorder.

I like to think of new applications and invent new machines. The one I’ve been working on for the last couple of years is likely going to be my legacy.

If we limit ourselves to the inner-ear endoscopes, they already have fixed lighting and range. As long as the ambient light outside is static, it might not matter. Slight color variations within a particular filament might cause grief. Also, matching a snippet of filament from the spool against a strand of filament from the nozzle could be hard.

The community is new and still learning how this machine operates; you’re diving in with us. It will take time for us all to fully understand, some more than others, and to help out the new guys.

Human eyes are much more sensitive than a regular camera in terms of colours; I doubt that the camera is good enough.
The current purging calibration is already based on colour-similarity scales, and it is already very effective and efficient.

No, it is not effective. I used the autocalc and was greatly disappointed; it is very wasteful. Asking for assistance on Discord only got me spoken to like a noob. Here you are dissing the idea without any evidence.

@mike.yearwood, what multiplier and material did you use for this result?

I used the autocalc function and default settings. Thanks for asking the correct tech-support question instead of treating me like I’m stupid.

Oh man, I directly got your point and the functionality you wish for the printer.
This forum should have a function so the thread author can delete unhelpful answers.

It would be so great, and far more efficient, if there were an automated routine for material-purge calculation. I could imagine that running the routine once before a print with material swaps could save a lot of filament over the whole print. And then a database that stores the values for you: perfect.

Maybe a half-automated version with mobile app support would not be hard to program:
Print a defined sample whose first part is 100% Filament-Color-X and whose last part is 100% Filament-Color-Y.
Take a photo with the app.
Calculate after which distance Filament-Color-Y reaches ~100%.
Store that in your database as the purge value from X to Y.
Done.
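
That routine could be sketched like this (the strip data, tolerance, and database shape are all invented for illustration; a real app would average pixels from the photo per millimetre of strip):

```python
import math

def purge_distance_mm(samples, color_y, tolerance=10.0):
    """Walk along the photographed sample strip (one averaged RGB per mm)
    and return the distance at which it is effectively 100% color Y."""
    for mm, rgb in enumerate(samples):
        if math.dist(rgb, color_y) <= tolerance:
            return mm
    return None  # strip never reached pure Y; purge sample was too short

purge_db = {}  # {("X", "Y"): distance_mm}

# Fake strip: red (X) blending linearly into blue (Y) over 10 mm.
red, blue = (255, 0, 0), (0, 0, 255)
strip = [tuple(r + (b - r) * t / 9 for r, b in zip(red, blue))
         for t in range(10)]

purge_db[("red", "blue")] = purge_distance_mm(strip, blue)
print(purge_db)  # → {('red', 'blue'): 9}
```

Returning `None` when the strip never settles is the useful failure mode: it tells you the printed sample was too short for that color pair, so the next sample should be longer rather than the database storing a bogus value.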
