Aloha! And thanks for the information!
I think you mean DLPrinter...
I'm curious too. The module you indicated was the UVM-30A. How did you end up getting that module into the small tipped cone?
Nanodlp has now implemented an even better mask generation algorithm, similar to the one on buildyourownsla.com but better: you don't need Rhino, magic numbers, or formulas to determine the mask, and any UV sensor sensitive in the resin's cure range can be used without calibration, thanks to the set-point methodology.
I'm up and running. Thank you Shahin!
Ahhh! Nice find! That's a feature I didn't know about. Changing color to white on the profile fixes the issue.
Looks like black is the default when I add a new profile. I didn't realize I had to change it to white.
I sent the debug file to you in email. Find anything?
Removed mask
installed build 1222
rebooted rpi
created plate with ring3_plate.svg
nanodlp slices
still get error: Skipping public/plates/10/info.json file. stat public/plates/10/info.json: no such file or directory
result is black preview images
I don't have anything else to try, so hopefully you have a suggestion for how to get this up and running again?
No, everything is black with the mask or with it removed.
I used my PGM file, resized it, and converted it to JPG (instead of PNG). I uploaded the mask and checked that it was there (it looks great). I restarted the rpi and loaded the SVG onto the plate. It slices, but all the preview images are completely black.
Here's what I know...
Re-added mask and got this error on home page: Problem in recently uploaded plate file has been detected. Slicer module terminated. Please, remove uploaded plate and restart system to use slicer again. To investigate problem, please contact us. interface conversion: image.Image is *image.NRGBA, not *image.RGBA
Removed mask and then the slicer doesn't work. It just displays forever: "Plate #X being processed...". No pictures in Preview Layers.
Unplugged and re-plugged in the rpi
Slicer acts like it is working, but the Preview Layers are completely black.
Get error message: Skipping public/plates/5/info.json file. stat public/plates/5/info.json: no such file or directory
I also tried re-formatting and rebuilding the SD card and re-installing nanodlp. This does not change the behavior. The only thing I can think of is that using the mask file caused a problem that I can't recover from.
I unplugged and re-plugged in the raspberry pi and then it started slicing, but all slices are completely black. This is really weird. The home page gets this error: Skipping public/plates/5/info.json file. stat public/plates/5/info.json: no such file or directory
I'm completely stumped. I can't print at all until this is figured out. Or, is there a way to step back to a prior build?
No mask shows up on the Setup...Projector Mask tab, so I am assuming no mask is being used. That said, I sent you the mask PNG to see whether it works for you or causes a problem.
debug file: nanodlp.debug.1924633070.zip
Attachments won't let me upload the 9MB SVG file.
Removed the mask, deleted plate, and re-uploaded SVG onto new plate. Still says "Plate #1 is being processed...". Preview Layers button shows no layer images.
I am having problems with SVG files that worked before. Nanodlp will no longer process them. I use slic3r as well. I've upgraded to build 1221, but no luck.
At one point this message was displayed on the screen: "Problem in recently uploaded plate file has been detected. Slicer module terminated. Please, remove uploaded plate and restart system to use slicer again. To investigate problem, please contact us. interface conversion: image.Image is *image.NRGBA, not *image.RGBA"
Is nanodlp trying to slice the svg?
Depending on the size you need to print, you could have different masks that optimize printing speed. For example, you could have a mask that dims very little for even intensity over a 3" diameter, one that dims more for a 4" diameter, and one for the whole projection area. The first mask would print very fast for small items, the second would print slower but would work for medium items, and the third would handle very large prints but take the longest. While you can still do this, the setup time is longer due to the re-uploading required for each mask. Note: I haven't tried the internal slicer.
Wow, you guys move fast! This is very cool. So is there a difference in over/under cure; can you tell? Also, in speaking with Dean we had a question about real-time updates: if the mask is toggled on or off after the original SVG upload, does this change the print, or does it need to be reloaded each time? Also, thinking about the greyscale masks, this might be the reason for extended cure times, since the greyscale (I think) comes from fluttering the DMD, reducing the dose each pixel throws onto the plate. If that is the case, it might be best to change the initial brightness setting (if that is a separate overall setting) to even out the throw. I have heard that removing the lamp filter can help even this out as well, which is what today's experiments are for...
Larry - can you send the link to your sensor? Mine is completely different from yours (hence the square base for the handle...) I really like the grid drilled pattern you did! Very simple, elegant and precise!
I think you mean DLprinter...
I agree, DLprinter went very fast creating the templates. And DLprinter, I too would be interested to know which sensor you are using.
Shahin explained that he deletes the SVG when the slices are processed, to conserve space. If we didn't care about space, then maybe he could keep them and auto-apply the mask when it is chosen/created? (SD cards with lots of space are very cheap.) If users change masks often this may be useful; otherwise it means uploading SVGs every time a mask is created/changed.
I would suggest the following for the help:
You need a UV sensor or equivalent to use this feature.
How it works
Empty the vat of resin and clean the vat bottom surfaces inside and out
Specify the number of measurement points in both the x and y axes, and the size of the measurement squares
Tabular cells are displayed on the user interface. You can change the cell values from 0 = black to 255 = white.
When measuring, hold the UV sensor in the middle of the measurement square projected on the inside of the vat
Use the UV sensor to determine a set point value, which is usually the lowest value of all the measurement squares (probably one of the corner values)
Decrease the cell value for each measurement square until the UV sensor reads the set point value (do not press enter unless you want to display the mask)
Press preview and double-check the surface area to see whether the UV intensity is the same across the entire vat bottom.
Save the mask (it takes up to a minute to generate a complete mask; a sketch of how the grid expands into a full mask follows these steps).
Go to Setup..Projector Mask to see the saved mask on the user interface (press refresh if necessary)
You can use the "upload image" button on the plates page to display any image.
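To make the "generate a complete mask" step concrete, here is a minimal sketch of how the grid of cell values could be expanded into a full-resolution blocky mask. This is just my own illustration using Python and Pillow, not NanoDLP's actual code, and the grid values are made-up examples rather than measured ones:

from PIL import Image

# Example 3x3 grid of cell values entered in the table (0 = black ... 255 = white).
# These numbers are hypothetical; use the values you tuned against your set point.
cells = [
    [220, 210, 220],
    [210, 200, 210],
    [220, 210, 220],
]

rows, cols = len(cells), len(cells[0])

# Build a tiny greyscale image, one pixel per grid cell.
grid = Image.new("L", (cols, rows))
grid.putdata([v for row in cells for v in row])

# Nearest-neighbour upscaling keeps each cell as a solid block, which matches
# the "blocky mask" described later when viewing Setup..Projector Mask.
mask = grid.resize((1920, 1080), Image.NEAREST)
mask.save("mask_blocky.png")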
I also had to remove the current mask
Could you display both masks and check out the result with your UV sensor?
As far as I can tell, both masks perform within the measurement error of my sensor. I am averaging readings, since the sensor reading jumps around. Perhaps someone who has a better sensor than mine could perform this experiment?
In any case, the software provides a method to gather the mask values and create the blocky mask, and a prior post describes a method to smooth the mask with the resize algorithm, so both masks are possible at this point.
I suppose over time this mask capability will be refined. I'm sure the community will weigh in. Muve3D support made some nice comments about this capability.
Thanks so much for trusting my concept and then helping me implement this capability into nanodlp!!!
Elliot,
Re: bandpass - Looking at the curve for the sensor, the 405nm output will be about 20% of the output for 390nm, but there should still be an output. At least that is my hope.
Re: stl - Super! Thanks!
Re: brightness - I personally haven't played with brightness. I can conceptualize the bleedover situation you describe, but I'm not quite sure how it maps to even UV intensity across the entire vat bottom? I think bleedover might be very local (millimeters), while lamp intensity varies across the entire vat bottom (multiple cm)? The greyscale is being done by projecting the mask versus just a blank white rectangle. Any time a slice is projected, it does a boolean AND of each pixel on the mask with the corresponding pixel on the slice. The mask is created by the method described in prior posts. I'm no expert at this stuff, so hopefully this is making sense...
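Something like this is my rough guess at the pixel-wise combination (not NanoDLP's actual code; the file names are just placeholders, and it assumes the mask and slice are the same resolution):

from PIL import Image, ImageChops

# Hypothetical file names for one slice and the saved mask, both at projector resolution.
slice_img = Image.open("slice_0001.png").convert("L")
mask_img = Image.open("mask.png").convert("L")

# For a greyscale mask, taking the darker of the two pixels acts like the
# per-pixel AND described above: black mask pixels block the slice, white
# mask pixels pass it through unchanged, and grey values dim it.
projected = ImageChops.darker(slice_img, mask_img)
projected.save("projected_0001.png")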
Regards,
Larry
Could you display both masks and check out the result with your UV sensor?
Is there an easy way to display the mask I upload on the DLP?
With this I tuned the projector settings, allowing an increased blue-light shift, so that the curve was stable between 390-450nm. (The UV sensor from Spark (ML8511) is best up to about 390nm, after which the sensitivity falls off rapidly.) It's probably best to choose a smooth curve over this area, since most resins used for DLPs cure at 405nm (or are at least targeted for this wavelength). I'll list the settings I found to be best for the curve in a separate message.
I've heard too that resin cures at 405nm and that the ML8511 falls off around 390nm. My plan was to purchase a bandpass filter at around 400nm that I would put over the ML8511 sensor (mine is on order).
I also made a mount for the UV sensor with measurement tabs on the sides to allow easier identification of where the sensor is being placed on the grid. Again, if you guys are interested I'll post this handle in stl in a separate post.
I was planning to make a mount for my sensor as well, I'd love to see your stl information!
Just to make sure I'm caught up with the journey thus far: we are all trying to normalize the intensity of the UV across the mask. The current method is to use the lowest value and reduce brightness in the other areas... I just wanted to ask, if we start with a brightness value of 20%, could we not adjust up and down to make it easier? If the lamp intensity follows a curve, a default starting brightness of 50% or higher might be adding to this fifth-power polynomial solution and thus to the error.
Yes, we are trying to normalize the intensity of the UV across the floor of the resin vat (preferably at 405nm). I'm not sure I understand your question about brightness, and I'm not sure whether brightness is proportional to UV intensity. But if it is, it seems like you would want the most brightness you could get, 100%, so the resin cures faster. Then wherever the brightness (UV intensity) is hot, we damp it down to get an even brightness (UV intensity) across the floor of the resin vat. So we could end up with, for example, 50% overall brightness (UV intensity).
I think one of the cool things about the method Shahin has implemented is that you don't care about the linearity of the sensor or the lamp. The user simply changes the greyscale value until the sensor reads the right value at every grid point.
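If anyone ever wanted to automate that per-cell adjustment, it might look roughly like the sketch below. The set_cell_value() and read_uv() functions are placeholders for whatever your projector and sensor interfaces actually provide (they are not NanoDLP or sensor-library calls); NanoDLP itself has you do this by hand through the table:

def tune_cell(set_cell_value, read_uv, set_point, start=255, step=5):
    # Lower one cell's greyscale value until the sensor reading drops to the
    # set point (the reading taken over the dimmest square). No assumption is
    # made about sensor or lamp linearity; we just step down and re-measure.
    value = start
    set_cell_value(value)
    while read_uv() > set_point and value - step >= 0:
        value -= step
        set_cell_value(value)
    return value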
Here is the mask I generated with Nanodlp:
So I decided to see what this would look like with the resize algorithm. I followed this process (a rough scripted equivalent is sketched after these steps):
Create a PGM file by inserting the values you entered into Nanodlp (make sure to have .PGM as the file extension)
test27.pgm.txt
Convert the PGM to PNG (select color=gray):
Use this URL: http://image.online-convert.com/convert-to-png
The image is very tiny right below this text
Resize to 1920x1080 (select png file from above, section 4 uncheck "keep aspect ratio" & enter 1920 & 1080, section 6 select PNG, section 7 select Lossless Compression, click [resize image] then [download image])
Use this URL: http://resizeimage.net/
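If you would rather skip the online converters, here is a rough local equivalent of those steps using Python and Pillow. It's only my own sketch (the grid values and file names are placeholders); it mirrors the blocky-mask example above but upscales with a smooth filter instead of nearest-neighbour:

from PIL import Image

# Values entered into Nanodlp (hypothetical numbers; substitute your own grid).
cells = [
    [220, 210, 220],
    [210, 200, 210],
    [220, 210, 220],
]
rows, cols = len(cells), len(cells[0])

# Step 1: write the grid out as a binary (P5) PGM file.
with open("test.pgm", "wb") as f:
    f.write(f"P5\n{cols} {rows}\n255\n".encode("ascii"))
    f.write(bytes(v for row in cells for v in row))

# Steps 2-3: load it, then resize to the projector resolution with a smooth
# bicubic filter so the mask has no hard block edges, and save as PNG.
img = Image.open("test.pgm").convert("L")
smooth = img.resize((1920, 1080), Image.BICUBIC)
smooth.save("mask_smooth.png")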
Are you sure we need more smoothing?
Actually, I'm not sure. This is a good question. I think it is all about resin cure. The more even, the less chance of over cure or under cure or "shelves" that may cause stress in the part. As a mitigation, I suppose additional cells can be added to the grid to make the result even smoother?
I think I'm going to ask an expert I know on resin cure to see what he thinks.
I also plan to experiment some more to see what my sensor sees when a full mask is completed and displayed.
I use this sensor I got off of eBay: ML8511 UVB UV Rays Sensor Breakout Test Module
Shahin, looks like for build 1218 the blocky mask is being stored when [save mask] is pressed and then displayed on Setup..Projector Mask
(On this test, I put value 200 in center cell and 220 on each side)