Solution in Perl
How to take a screenshot of a website?
To take a screenshot of a website, you need a rendering engine, such as the one used by web browsers to interpret and display a web page. The idea behind the WWW::Mechanize::Firefox module is to drive Firefox from Perl, so that Firefox renders the HTML and returns a screenshot.
You first have to install the MozRepl extension in Firefox: the module talks to Firefox through it.
use strict;
use warnings;
use WWW::Mechanize::Firefox;

my $mech = WWW::Mechanize::Firefox->new();
$mech->get('http://google.com');

my $png = $mech->content_as_png();
open(my $out, '>', 'tt.png') or die $!;
binmode $out;    # PNG data is binary
print {$out} $png;
close($out);
Save logs with a big buffer
There are two reasons why I do not like to use a simple redirect to save the stdout/stderr of a daemon:
- It keeps a file descriptor permanently open on the log file, which does not play well with a daemon running forever: you cannot rotate or remove the file without restarting the daemon.
- The buffering is bad: if the daemon produces few logs and prints them to stdout, it may take a very long time before they actually reach the file.
I've written log.pl to address these issues. It uses a big 32 KB buffer to improve performance, but flushes it every 30 seconds, so I never have to wait longer than that to see a single log line written to the file.
my_daemon 2>&1 | log.pl logfile
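A minimal sketch of what a log.pl-style logger can look like (this is an illustration, not the author's actual script; the file name, buffer size, and flush interval follow the description above). It reads stdin, accumulates lines in a buffer, and writes them out when the buffer reaches 32 KB or 30 seconds have passed, whichever comes first:

```perl
#!/usr/bin/perl
# Hypothetical sketch of a buffered logger in the spirit of log.pl.
use strict;
use warnings;
use IO::Handle;
use IO::Select;

my $logfile  = shift @ARGV or die "usage: $0 logfile\n";
my $max_buf  = 32 * 1024;   # flush threshold: 32 KB
my $max_wait = 30;          # flush interval: 30 seconds

open my $out, '>>', $logfile or die "cannot open $logfile: $!";

my $sel        = IO::Select->new(\*STDIN);
my $buf        = '';
my $last_flush = time;

sub flush_buf {
    return unless length $buf;
    print {$out} $buf;
    $out->flush;            # push the data to the file right now
    $buf        = '';
    $last_flush = time;
}

while (1) {
    # Wait for input, but never longer than the flush interval,
    # so an idle daemon still gets its logs written out on time.
    my @ready = $sel->can_read($max_wait);
    if (@ready) {
        my $n = sysread(STDIN, my $chunk, 4096);
        last unless $n;     # EOF: the daemon closed the pipe
        $buf .= $chunk;
    }
    flush_buf()
        if length($buf) >= $max_buf
        || time - $last_flush >= $max_wait;
}
flush_buf();                # write whatever is left on exit
close $out;
```

The `can_read($max_wait)` call is what bounds the latency: even when no data arrives, the loop wakes up at least every 30 seconds and flushes the pending buffer.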