QUOTE(Fuchur @ Nov 21 2007, 06:17 AM)
If your image has a resolution of 1024 px x 768 px at 72 dpi you can put in a higher dpi count (for example 300 dpi) and will gain a larger resolution (4267 px x 3200 px).
You will only get a larger file, but fundamentally, if the file was rendered at 1024 x 768 pixels, that is all the picture information you will ever get. If you enlarge the image by increasing the dpi, you will not add new picture information. You will only get the same picture information as larger, blurrier pixels.
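You can check this for yourself in a few lines of Python (a rough sketch using the Pillow library; "render.png" is just a placeholder name):

```python
from PIL import Image

# Open a render at its native resolution, e.g. 1024 x 768.
img = Image.open("render.png")

# "Raising the dpi" from 72 to 300 while keeping the print size
# just means rescaling the same pixels:
scale = 300 / 72
big = img.resize(
    (round(img.width * scale), round(img.height * scale)),
    resample=Image.Resampling.BICUBIC,
)

# The enlarged file is bigger, but every new pixel is interpolated
# from the original 1024 x 768 grid. No detail has been added.
big.save("render_enlarged.png")
```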
QUOTE(Fuchur)
I think this has to do with the displays. To work on a "300 dpi" image on a screen which can only do 72 dpi you have to convert the dpi somehow. So to get the same information on the 72 dpi display you have to increase the resolution.
So in the end it isn't "wrong", it is just not the right word...
These are just technical side issues. They are not relevant and only add confusion. The real issue is actually much, much simpler than that.
QUOTE(Fuchur)
One way to get rid of the problem would be to give a second output option in A:M => dpi... but I think you don't have to program that.
There are no problems there. There are only confusions.
I think both Martin and Will recap the issue very well: "if it looks good then it is good". But I see that it is still not understood, so let me try to add some explanations:
First -
The quality or precision of an image does not depend on the output DPI of any device, and it has nothing to do with whether the device is a printer or a display. It only has to do with the pixel resolution of the image and the distance from which the viewer will look at it. I repeat: the only important factors are the pixel resolution and the distance of the viewer from the image. Don't even think in terms of DPI, because it is totally irrelevant.
The actual important criterion for deciding on an image resolution is the resolution the human eye will see. This is the only criterion to consider, and this is why "if it looks good, then it is good".
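To put rough numbers on "the resolution the eye will see" (a back-of-the-envelope sketch, assuming the commonly cited figure of about 1 arcminute for normal visual acuity):

```python
import math

def ppi_needed(viewing_distance_inches, acuity_arcmin=1.0):
    """Pixel density beyond which the eye can no longer separate
    individual pixels, for an eye resolving `acuity_arcmin` of angle."""
    # Size of one just-resolvable spot at this distance, in inches.
    spot = viewing_distance_inches * math.tan(math.radians(acuity_arcmin / 60.0))
    return 1.0 / spot

print(round(ppi_needed(12)))   # ~286 ppi at reading distance (12 in)
print(round(ppi_needed(120)))  # ~29 ppi for a poster seen from 10 feet
```

That is why the same 1024 x 768 image can be hopeless as a magazine page yet perfectly fine as a poster seen from across a room.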
If your image is designed to be, say, 5 inches wide when printed, then display it on your screen so it is about 5 inches wide and look at it from a normal reading distance. With a modern high-resolution screen, if it looks good that way, then it is good for printing. But if you really want to be sure, display the image at twice its printed size; if it still looks good that way, then it is already more than good for printing.
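If you want to automate that screen test, the zoom factor is simple arithmetic (a sketch; the 96 ppi default is an assumption, measure your own display if you want it exact):

```python
def preview_zoom(image_width_px, print_width_inches, screen_ppi=96):
    """Zoom factor that shows the image at its intended printed width."""
    return (print_width_inches * screen_ppi) / image_width_px

# A 1024 px wide render meant to print 5 inches wide, on a 96 ppi display:
print(f"view at {preview_zoom(1024, 5) * 100:.0f}% zoom")  # ~47%
```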
Second -
DPI is only a byproduct of scaling an image to a specified printed size. Take your example of a 1024 x 768 pixel image. If you wanted to print it one inch wide, the printer would need to fit 1024 pixels into 1 inch, so the image would be printed at 1024 DPI. DPI means Dots Per Inch. That is all.
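That arithmetic is the whole story (a trivial sketch):

```python
def dpi(pixel_width, print_width_inches):
    # DPI is nothing more than pixels divided by printed inches.
    return pixel_width / print_width_inches

print(dpi(1024, 1.0))   # 1024 dpi when squeezed into one inch
print(dpi(1024, 14.2))  # ~72 dpi when spread over 14.2 inches
```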
Third -
Printers deal with LPI anyway, not DPI. They are not the same thing at all. An actual digital printing device may have 4000 DPI, but that does not change the fact that the actual printing will be specified in LPI. A digital printing device only uses its DPI resolution to get a more precise LPI resolution. And the dots of a printing device are binary (each dot is either black or white), while each pixel on a screen carries 256 shades. So even though they are both called DPI, they fundamentally don't mean the same thing.
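The usual rule of thumb relating the two (a sketch of the generic halftone-cell arithmetic, not any particular printer's spec):

```python
def gray_levels(device_dpi, halftone_lpi):
    """Shades a binary-dot device can simulate per halftone cell:
    an n x n cell of on/off dots gives n*n + 1 levels."""
    n = device_dpi / halftone_lpi
    return int(n * n) + 1

print(gray_levels(2400, 150))  # 257 levels: smooth grays at 150 lpi
print(gray_levels(600, 150))   # 17 levels: visible banding likely
```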
Fourth -
A printer can print any picture, at any resolution, at any specified print dimensions on the paper. DPI is totally irrelevant. Just supply the image to the printer, specify the final print dimensions in inches, and he will do the rest. You should not have to bother with that aspect of the printer's business. That is his job, not yours. If the printer insists that you have to figure out that sort of technical issue for him, then look for another printer.
Fifth -
Good antialiasing does wonders. You can get away with much lower resolutions if you have good antialiasing on your image.
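Supersampling is the brute-force way to see what antialiasing buys you (a rough Pillow sketch: draw at several times the target size, then downsample):

```python
from PIL import Image, ImageDraw

def render_line(width, height, supersample=1):
    """Draw a diagonal line, optionally at `supersample` times the
    target size and downsampled - a basic form of antialiasing."""
    s = supersample
    img = Image.new("L", (width * s, height * s), 255)
    ImageDraw.Draw(img).line((0, 0, width * s, height * s), fill=0, width=s)
    if s > 1:
        img = img.resize((width, height), resample=Image.Resampling.LANCZOS)
    return img

render_line(200, 120).save("aliased.png")                  # hard stair-steps
render_line(200, 120, supersample=4).save("smoothed.png")  # soft edges
```

The smoothed version reads as a clean line at a much lower resolution than the aliased one would need.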