Printing a PDF from evince results in garbled output

I am trying to print a PDF and upon printing, I get garbled output. I cannot print in chrome, so instead, I'm printing in evince which does allow me to print to the printer, but produces the garbled output. Chrome is complaining the printer is not properly installed or configured and won't let me print to it. I haven't changed anything, nor do I see errors in the cups log file.
 
Is this with all PDFs or just one file? I usually print PDFs by using pdf2ps to convert to a PostScript file, then just send it to the printer with netcat:
Code:
pdf2ps < myfile.pdf > myfile.ps
nc <printer IP> 9100 < myfile.ps
Actually, if you just run pdf2ps myfile.pdf it will produce myfile.ps; you don't need the redirection.

This isn't answering your actual question, it's just offered as an alternative possibility.
 
Good catch, it is with that particular PDF. I printed another one just fine. I'm curious as to why that happens and if there is a way to fix it.
 
I had that once, many years ago. A client reported a failed printer. It took a few days to identify that a specific PDF document was causing the printer to freeze/lock up, as a reset cleared it and it then worked fine until the user tried to print that specific file again. Eventually, I tracked down a report on the HP website acknowledging the fault and marking it "will not fix". The workaround was to open the PDF in a different app and re-save it. I've forgotten the full details, but it was something quite rare and only happened with PDFs produced by one particular PDF creator. No idea if it was the HP PDF renderer failing on some rare operation or simply not failing gracefully when it got bad code. A bit like UK air traffic control :-)

So, try editing and saving the PDF file, or converting it to another format and back again, i.e. pretty much anything that will change the file and hopefully produce a "correct" version.
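If you want to script that workaround, one hedged sketch (assuming Ghostscript is installed; "input.pdf"/"fixed.pdf" are placeholder names) is to run the file back through Ghostscript's pdfwrite device, which rewrites the whole document and often normalizes broken structure:

```shell
# Rewrite input.pdf through Ghostscript's pdfwrite device.
# -o sets the output file and implies -dBATCH -dNOPAUSE.
gs -o fixed.pdf -sDEVICE=pdfwrite input.pdf
```

Then try printing fixed.pdf instead of the original.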
 
There are several PDF-to-PostScript converters.
graphics/poppler https://poppler.freedesktop.org/ — command pdftops. Evince uses poppler.
print/ghostscript https://www.ghostscript.com/ — command pdf2ps
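For comparison, the two converters are invoked similarly (file names here are placeholders):

```shell
pdftops myfile.pdf myfile.ps   # poppler's converter
pdf2ps  myfile.pdf myfile.ps   # ghostscript's converter
```

If one produces output that prints correctly and the other doesn't, that narrows down which renderer is choking on the file.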

I have good results with graphics/zathura-pdf-mupdf (MuPDF-based), though I do use lpr printing. It is keyboard driven and fast.
With the pdf open in zathura, command
Code:
:print
opens a print dialog.

There is an old 2013 Arch Linux post saying that zathura-pdf-mupdf did not work well with CUPS and that graphics/zathura-pdf-poppler worked better. I suspect that mupdf's rendering may have improved since 2013.
 
This is still an issue for me, I haven't resolved it. I have more PDFs that have this issue.

Yes, if I convert to postscript, it works fine, but I believe this was working before. I wonder what changed?
 
I changed the title to match the problem better. If I convert the PDF to PostScript using the above commands, which are actually:

pdf2ps input.pdf output.ps

nc <printer IP> 9100 < output.ps

The file prints just fine. Therefore, something wonky is going on with evince; what it is I haven't a clue, but I was fairly certain evince worked well for me before.
 
This still doesn't *necessarily* point at evince as the *real* problem. It's probably still the PDF itself that's messed up (technical term). pdf2ps might simply ignore the wonky bits that evince does not when producing the ps output.

I've certainly had a number of PDF files that have been, shall we say, less than smartly assembled. One file had a small 2 inch/4 cm drawing on one page that, when rendered, was little more than a shaded outline, but it was drawn with thousands upon thousands of lines. This blew up some PDF viewers and lightweight printers that just couldn't deal with the sheer number of elements. Looking at the details, I'm pretty sure the vendor just embedded output directly from AutoCAD even though nearly none of it was actually viewable in the final document (there were other similar high-element-count images completely covered by, for example, a block of text with a solid white background). PDF files are a great way to produce a stream of expletives.
 
Hmm, how can I verify that? I have been seemingly having this issue with any PDF I try to print through evince now. And, while it is certainly possible the PDF has junk in it, I would still expect some to work.
 
It would help to have more information. You are using CUPS. What sort of printer do you have? How is it configured? Does it print PDF native?
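To gather that information from CUPS itself, a few standard lp commands help (the queue name "MyPrinter" is a placeholder):

```shell
lpstat -v                    # device URIs of all configured queues
lpstat -p -d                 # printer states and the default destination
lpoptions -p MyPrinter -l    # driver/PPD options for one queue
```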
 
If it's *any* pdf, mostly ignore what I was ranting about.

But... the next question is, where are these PDFs coming from? The same source? Multiple different places? (Really different, as in completely different businesses/websites.) If they're all from different places, then your earlier conclusion that something's wrong with evince is much more likely. But maybe it's not a problem with evince itself, rather with the installation. I've never had any real problems with evince in the past, so if it started messing up everything I'd look for a mangled configuration within evince itself.

If it's multiple different PDFs from the same/similar source, the problem may still be in the PDFs themselves. You could, if the PDF doesn't contain sensitive information, use some on-line PDF validation site to check basic conformity.
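If you'd rather not upload the file anywhere, qpdf (not mentioned above, just a suggestion) can do a basic structural check locally:

```shell
# Reports syntax/structure errors; exit status is non-zero on problems.
qpdf --check suspect.pdf
```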
 
These PDFs are from different websites. It is a Xerox WorkCentre 3615. I don't recall if I printed via jetdirect or regular printer setting. I checked it with Adobe:

Result:
The checked PDF/A profile was "PDF/A-1B"


Details
ISO 19005-1:2005

6.2.3
DeviceRGB may be used only if the file has a PDF/A-1 OutputIntent that uses an RGB colour space

ISO 19005-1:2005
6.5.3
For all annotation dictionaries containing an AP key, the appearance dictionary that it defines as its value shall contain only the N key. If an annotation dictionary's Subtype key has a value of Widget and its FT key has a value of Btn, the value of the N key shall be an appearance subdictionary; otherwise the value of the N key shall be an appearance stream.

The document renders fine in evince, it just doesn't print properly. Anyplace there is a character, I get a box or a box with a question mark inside of it.
 
Ok. That's a font issue, clearly :-) Possibly a language/code page/UTF issue.
Are the problematic PDFs in a different language than the printer's default? If the PDF doesn't have an embedded font, but your computer has a font with a wide glyph set, then evince may render it correctly on screen, yet when printing it just sends the PDF itself. If the printer doesn't have an appropriate font, the usual box/? will be rendered for missing glyphs. Ghostscript may embed fonts in the converted PostScript, which would explain why that route works.

I'm just guessing/cogitating-out-loud here.
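The font-embedding guess above is easy to test with poppler's pdffonts tool (the file name is a placeholder):

```shell
pdffonts suspect.pdf
```

The "emb" column shows "yes" for embedded fonts; if the problem documents show "no" there while a working document shows "yes", that supports this theory.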
 
I compile CUPS with all the knobs turned on... even for dependencies that offer any kind of support for PDFs. If a knob suggests PDF support, I turn it on, even if I don't know that I really need it. That way, I have options - if one method of PDF rendering/printing is not reliable, there's others to try.

Works great in practice. On a rare occasion, some Chinese characters don't print/display properly, and instead show question marks (although, even that was getting better recently). And I can tell you, the printer as a device doesn't care if you send it a Chinese character or a dozen of 'em on a PDF page. The real issues are with encoding and filtering on your computer. Run Print Preview before sending anything to the printer.
 
Print preview will still be running on the computer, not the printer. If a character (code point) renders well on your computer because the font it's using has that character's glyph, it'll look good in the preview; but if the font used on the printer itself doesn't have the glyph, then... you'll still get boxes/question marks for any characters outside the embedded font's range.

PDFs *can* embed fonts to get around this very problem, but it's not required, and increases the size of the PDFs (not unreasonably, usually, as only the glyphs used get embedded). Sending rendered bitmaps to the printer also gets around this issue. I'm guessing ghostscript does this (font embedding) and evince does not. I don't see any options for this in my version of evince. This also probably isn't a CUPS issue, either as font embedding is more the responsibility of the source application.
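One hedged way to force the bitmap route, assuming poppler's pdftoppm is installed and your CUPS queue accepts image files (file names are placeholders):

```shell
# Rasterize every page at 300 dpi, then print the images;
# the printer never sees any font references.
pdftoppm -r 300 -png suspect.pdf page    # writes page-1.png, page-2.png, ...
lpr page-*.png
```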
 
That's the whole point of print preview - what you see in the preview is what gets printed. If you see that your print preview is a mess, you can just decide to not use paper. That works under CUPS, too. Just compile ghostscript, evince, whatever with all the PDF support available in the knobs.
 
Completely untested by me, but I suspect that if a PDF contains Asian characters and sending it to the printer with netcat doesn't produce a good document (as happened to a friend of mine with an Epson, who got pages of gibberish), converting with pdf2ps first and then sending the result via netcat might work. astyle, as my wife sometimes has to print Japanese documents, I thank you for your suggestions, which I'll use if it ever doesn't work. (She usually prints from her Mac to a Brother MFC, so we haven't had such a problem, but it's a good thing to know.)
 
The print preview is what the computer/application *thinks* is going to be printed. The preview is generated by the computer, *not* the printer. It's the computer's/application's best guess at the output, not *actually* what gets printed. What the printer does may differ. To again harp on one point: if the computer sends ASCII/UTF text but does *not* embed the font, then the printer will pick its own font that it thinks is closest and try that. The result may be exact, close, or totally wrong depending on the fonts installed in the printer relative to the fonts referenced in the document. If the application embeds the font, or pre-renders bitmaps for the characters, you'll get something much closer to the predicted output.
 
Not quite how the printers work any more...

Also, these days, even a messed-up PDF will render and print OK, just use up-to-date software. Official Adobe products might complain, but that's it.

A thread I made may be of interest to you: Thread airprint-seems-to-be-the-new-standard-for-printing.94865
 
Printers still totally work this way. The "driverless" thing isn't driverless at all, but more one common driver for a base set of document formats that all printers support. Most/many modern printers support PDF directly, which is itself descended from PostScript. Sending a PDF *without* embedded fonts will still be a problem if the fonts installed on the printer itself don't include the necessary glyphs. The full UTF character set includes, what, several hundred thousand characters now? No printer is going to support all of those, though most Western, Chinese and Japanese characters aren't too much to ask of even a low-end printer nowadays. Now, embedding fonts *should* be at least a de facto standard, but isn't necessarily (so far as I know).

OK, I'm *at least* :-) partially wrong here. Most printers *do* support PDF, and everything I said about direct PDF printing is still true. BUT, the IPP spec *also* includes a PWG raster format for printing that is rendered on the computer, not the printer. So anything using the PWG raster format would, truly, be able to display an *actual* print preview that is, in fact, the actual output (as you said). This does require using the pwg raster format for the actual print, and that may not be what evince is doing.
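If you want to see what the PWG raster path would produce, CUPS ships a cupsfilter utility that runs the filter chain by hand (file names are placeholders; a sketch, not something I've run against this particular printer):

```shell
# Convert a PDF to PWG raster the way an IPP Everywhere queue would.
cupsfilter -m image/pwg-raster suspect.pdf > suspect.pwg
```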
 
This does require using the pwg raster format for the actual print, and that may not be what evince is doing
And that's why I suggested recompiling ghostscript/evince/cups/whatever with all the Makefile knobs turned on. Pre-compiled packages tend to have useful options turned off. There are reasons for that, as well as downsides to making such choices. I compile my ports with EVERYTHING turned on to the extent practical, because I disagree with the defaults picked for pre-compiled stuff.
 