ClamAV was doing its job scanning email via amavisd-new, catching all the nasties that folks tend to foist on their fellow net citizens. Unfortunately, when your spam and virus filters are doing their job, they occasionally catch senders who aren’t malicious, but also aren’t following best practices. This was the case with American Express. Emails to my clients from @welcome.aexp.com were being classified as “Heuristics.Phishing.Email.SpoofedDomain”. Searching around the net brought me to several sites where admins had ended up doing crazy things like disabling ClamAV’s heuristic scanning of email entirely, or creating elaborate policy banks in amavis. Well, I was having none of that. I like the most correct, simplest solution. Hopefully this methodology will help others solve similar issues.
One post I read referenced this document, which relates how to create whitelists for Clam’s phishing filters. That’s a good start. That same document mentions a utility script called “why.py” that helps isolate why an email is getting picked up by a rule. Unfortunately, my install didn’t have that script. A little searching brought me to a copy on GitHub. Running it led to a laundry list of Python-specific issues, mostly due to my environment. But, using the script as a guide, I just did it manually. The following command gave me a goldmine of information.
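Reconstructed from the notes below, the command was essentially this (the signature path is mine; adjust to your environment):

```shell
# clamscan in debug mode: -d points at the signature directory,
# and the argument is the raw quarantined message
clamscan --debug -d /var/lib/clamav/ amex_mail.eml
```

The debug output lands on stderr, so you may want to redirect it through a pager.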
A few items to note about the command:
- The path after the -d is the location of my AV signatures.
- The amex_mail.eml is the raw text of the email (headers and all) that I pulled out of our quarantine database.
In that giant slew of output from the clamscan debug, the important part was this:
LibClamAV debug: Phishing: looking up in whitelist: .www.youtube.com:.www208.americanexpress.com; host-only:1
LibClamAV debug: Looking up in regex_list: www.youtube.com:www208.americanexpress.com/
LibClamAV debug: Lookup result: not in regex list
LibClamAV debug: Phishcheck: Phishing scan result: URLs are way too different
LibClamAV debug: found Possibly Unwanted: Heuristics.Phishing.Email.SpoofedDomain
The issue is that the email contains links that display one URL, but point to a different URL altogether. I added the first pair to a file called daily.wdb in the same directory as my other ClamAV signatures (/var/lib/clamav/ in my case). With each pair that I added, I would re-run the debug command and discover a new pair. I ended up with three pairs in there before the emails checked out clean. Below are the contents of the daily.wdb file.
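The .wdb whitelist format pairs a real URL with the URL it’s allowed to display. Matching the order in the debug log above, the first entry would look something like this (the M: form matches hostnames exactly; this is a sketch built from the log line, not the verbatim file, and the other two pairs came out of iterating the debug scan):

```
M:www.youtube.com:www208.americanexpress.com
```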
Here is a more advanced example of a daily.wdb file.
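For a rough idea of what “more advanced” means here: the X: form takes full-URL regular expressions instead of exact hostnames. The patterns below are illustrative assumptions, matching any host under each domain, not my actual entries:

```
X:.+\.youtube\.com([/?].*)?:.+\.americanexpress\.com([/?].*)?
```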
Once I restarted clamd, the AmEx emails started to pass as expected. Hope this helps someone.
I like having Pandora going pretty much all the time, be it Bach when I’m coding, techno for sysadmin tasks, or indulging my shameful pop music addiction. I wanted a way to control Pandora without having to drop out of the shell. I wanted it for my Mac, but lucked out and found one that works across all the platforms I use. Pianobar is a command line Pandora client, and it works on Mac and Linux. (And Windows too.)
I was having a little trouble building it in Snow Leopard using the instructions from here, when I discovered that it’s already available in MacPorts. So I installed it with:
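With MacPorts in place, the install was just:

```shell
# pull pianobar and its dependencies from MacPorts
sudo port install pianobar
```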
In Linux, you can find links to the repos for your distro of choice on the Pianobar website.
Next, I wanted it to login automatically and start playing when I launched it. On Mac, you can create a config file at ~/.config/pianobar/config, with contents similar to the following:
password = s3cR3t_sQu1RR3L
user = firstname.lastname@example.org
autostart_station = 123456789
To get the station ID for the autostart_station parameter:
- Run pianobar
- Log in manually
- Launch your favorite station
- Hit i to see the station and song info.
- The station ID will be in parentheses after the station name.
After you’ve got your file saved, you should be able to launch pianobar and have it start playing auto-magically.
Now, my next step was to use at so I could start pianobar at a given time and use it as an alarm clock.
You need to enable atrun on your Mac to use at to schedule jobs. (It’s enabled by default on most Linux distros.) You can schedule the launch like so:
at 0600 tomorrow #or whenever you want to wake up
pianobar #hit enter
^D #that’s Ctrl-D, which submits the job
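As for enabling atrun, assuming a launchd-based release of OS X, it’s a one-liner:

```shell
# load the atrun launchd job so at(1) jobs actually get executed
sudo launchctl load -w /System/Library/LaunchDaemons/com.apple.atrun.plist
```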
If you start pianobar with at, it’s not on an interactive shell so you have no way to interact with it, or so I thought. You can create a fifo file to pass controls to the process:
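Pianobar looks for its control fifo at ~/.config/pianobar/ctl, so creating it is just:

```shell
# create the named pipe pianobar watches for commands
mkdir -p ~/.config/pianobar
mkfifo ~/.config/pianobar/ctl
```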
Once you have that, you can control pianobar by echoing commands into the fifo:
echo p > ~/.config/pianobar/ctl #pause/resume playback
echo q > ~/.config/pianobar/ctl #quit pianobar
Hopefully that’s enough food for thought to get you started. Enjoy.
For over a year now, we have been using a Samba-shared network printer to generate TIFF files from electronic documents so that they can be imported directly into iPerms. The TIFF printer is simply a script that takes PostScript input from the client machine’s print driver and converts it to an iPerms-compatible TIFF image. This is primarily useful for an Army installation, but may be relevant if your site is using some other document archiving system that takes TIFF images. I can say that ours has spit out over 30,000 pages.
How it works:
Users set up the printer on their system using a PS print driver. (On Vista, I usually use the HP Color LaserJet 2500 PS driver.) When they print a document to the printer, it generates a PDF, converts the PDF to a TIFF file, and removes the interim PDF. The completed file is dropped into a share with 0600 file permissions. I use 0600 because I set the share to only display readable files. Thus, while everyone is printing to the same folder, they only see their own files when they open the share. Less chance of PII leakage.
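The pipeline that script implements can be sketched roughly like this. The share path, file naming, and the Ghostscript tiffg4 device choice are assumptions for illustration, not my actual script:

```shell
#!/bin/sh
# Samba hands the print job to this script as PostScript on stdin.
# Convert PS -> PDF -> Group 4 TIFF, remove the interim PDF, and
# leave the TIFF in the share with 0600 permissions.
SHARE=/srv/tiffdrop                     # hypothetical share path
BASE="$SHARE/scan-$(date +%Y%m%d%H%M%S)-$$"

cat > "$BASE.ps"                        # spool the incoming PostScript
ps2pdf "$BASE.ps" "$BASE.pdf"           # PS -> interim PDF
gs -q -dNOPAUSE -dBATCH -sDEVICE=tiffg4 -r300 \
   -sOutputFile="$BASE.tif" "$BASE.pdf" # PDF -> Group 4 TIFF
rm -f "$BASE.ps" "$BASE.pdf"            # clean up the interim files
chmod 0600 "$BASE.tif"                  # only the owner sees it in the share
```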
First, the goods: http://jeffgeiger.com/cron/ (Source available for download from that page.)
This little app parses your crontab and generates a Gantt chart of the task timeline. It uses the twzCronChart class. I basically took their example and modified it to:
- Accept input via an HTML textarea.
- Parse out the comments and blank lines.
- Show a list of which cron line corresponds to which “Task”.
I’ll update this post if I make more changes.
Over the past few months, one of my “as time allows” projects has been a movie collection cataloguing script. Basically, it reads through your collection of digitally stored movies, catalogues them in MySQL, and gives you a PHP interface to your collection. It’s not fancy, it’s not pretty, and it almost certainly doesn’t adhere to any good coding practices.
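For a flavor of the approach, here’s a stripped-down sketch of the cataloguing pass. The function name, table layout, and extensions are made up for illustration; the real script does considerably more:

```shell
# Walk a directory of movie files and emit one SQL INSERT per file.
# Pipe the output into mysql, e.g.: catalog_movies /srv/movies | mysql movies
catalog_movies() {
  find "$1" -type f \( -iname '*.mkv' -o -iname '*.avi' -o -iname '*.mp4' \) |
  while IFS= read -r f; do
    t=$(basename "$f"); t="${t%.*}"             # title = filename sans extension
    t=$(printf '%s' "$t" | sed "s/'/''/g")      # escape single quotes for SQL
    p=$(printf '%s' "$f" | sed "s/'/''/g")
    printf "INSERT INTO movies (title, path) VALUES ('%s', '%s');\n" "$t" "$p"
  done
}
```

Emitting plain SQL keeps the scanning step decoupled from the database, so you can eyeball the statements before they hit MySQL.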
I will try to keep this updated as I “improve” the script and pages. Also, if by some twist of fate, you’re a skilled bash scripter with some free time, I’m open to suggestions on how this can be improved.