Tuesday, 13 May 2014

Ghetto Powershell fix: redux

Previously on ducksledge: We are forced to use a government-built program, and its auto-update process regularly downloads broken builds. This makes using it a bit tricky.

We still have to use the software, it's now installed on every computer in the company, and it's still intermittently failing for weeks at a time. The script has changed a bit, so it's time for a repost.

And here's what it looks like:

Main Changes
  • Because the install base is bigger, it's easier to run it over the whole affected IP range than to work from a computer list. This also prevents the occasional hiccup where a computer got missed because the DNS cache was out of date.
  • It's a lot more verbose. It's now quick to tell at a glance where the script is up to, which is handy on the odd occasion it gets stuck.
  • After each loop it gives you some pretty yet fairly useless stats, because I like pretty things.

What would be required to make this a real fix:
  • Get rid of the main loop, and get it to run once through only. Trigger that every 10 minutes or so from Task Scheduler.
  • Set it to watch the affected program directory for new folders. If a new folder gets pulled down, it'll assume it's a fixed one.
  • Have it identify the directory it's copying dynamically, so there's now zero reason to actually edit the script. Just copy the working files to the right place, wait ten minutes, and it'll start rolling out.

Thursday, 30 January 2014

Downloading vidd.me: The fun of sequential file naming

Vidd.me launched recently as a simple and effective way to share videos online. You upload your video file or gif, it converts it into mp4, and you get a link to the nice HTML5 page it's displayed on. It seems to work rather nicely, and the team are adding new features all the time without adding new limits.

Since it loads the raw mp4 file into the browser, getting the file is easy. Open up the source to any page and you have something like this:

And with our handy friend wget, you can download the full video from the commandline without any special tools at all.
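For example (the video number 1234 below is just a stand-in; use whatever number appears in your page's source):

```shell
#! /bin/bash
# Build the direct URL from a video number and fetch it.
# 1234 is a placeholder id, not a real example from the site.
id=1234
url="https://d1wst0behutosd.cloudfront.net/videos/${id}.mp4"
echo "$url"
# wget "$url"    # uncomment to actually download
```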

So far it's business as usual. You could have used anything to download it (including your browser). Things get a little more interesting when you look at the source of their 'new' or 'top videos' page:

There are three things to note here:

  • Each preview on the page has '-clip' added to the filename, to distinguish it from the main mp4 file
  • Every file seems to be stored on the same server using the same directory structure, d1wst0behutosd.cloudfront.net/videos/
  • The files appear to be named sequentially, with gaps for deleted or private files.
This presents a very simple way to find content. At the time I'm writing this they've got around 4200 videos, so let's pull down all the previews on their site in just a few lines:

for i in {1..4200}
do
                wget "https://d1wst0behutosd.cloudfront.net/videos/"$i"-clip.mp4"
done

A little while later.

If you feel like downloading everything on the site - full length - you just have to modify the script above ever-so-slightly, like so:

for i in {1..4200}
do
                wget "https://d1wst0behutosd.cloudfront.net/videos/"$i".mp4"
done

To take it further, it'd be pretty simple to set up a script that updates your mirror of the site: a cron job that grabs the 'latest videos' page, parses it for the highest video number, and then goes from the last downloaded video until that point (or just counts up from the last known video until you hit too many failures in a row). As a side-effect of the one-way sync, you'd have a copy of any videos that were subsequently removed.
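The 'count up until too many failures' variant can be sketched in a few lines. The fetch function here is a stub standing in for the real wget call, and all the numbers are made up:

```shell
#! /bin/bash
# Count upwards from the last known video until we hit a run of misses.
last_known=4200        # highest video number already mirrored (made up)
max_failures=20        # give up after this many misses in a row
fetch() {
    # stub: pretend only videos up to 4205 exist.
    # real version: wget -nv "https://d1wst0behutosd.cloudfront.net/videos/${1}.mp4"
    [ "$1" -le 4205 ]
}
i=$((last_known + 1))
failures=0
while [ "$failures" -lt "$max_failures" ]; do
    if fetch "$i"; then
        failures=0     # a hit resets the miss counter
    else
        failures=$((failures + 1))
    fi
    i=$((i + 1))
done
echo "stopped after video $((i - 1))"
```

A real run would also want to record the highest successful number somewhere on disk, so the next cron invocation knows where to start from.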

I have to point out that the last paragraph could violate the Terms of Service, which has a few conditions against scraping and DDoSing you might fall foul of. They don't seem to throttle heavy users yet, but it's likely they will eventually.

Instead of that, I'm more interested in picking and choosing based on what looks interesting from those previews I grabbed. Let's create a quick script that lets me specify an arbitrary number of videos (based on their number) and download them:

#! /bin/bash
if [ "$#" -eq 0 ]; then
    echo "no input arguments detected"
    exit 1
fi
for arg in "$@"; do
    wget -nv "https://d1wst0behutosd.cloudfront.net/videos/"$arg".mp4"
done

It takes the video numbers as command-line arguments, so now I can do this:

No need for browser extensions or custom software: you can now just grab any video you like with what you have installed.

As long as the privacy controls in place are solid and they put some throttling in place to stop people ripping the whole site every hour, there's nothing wrong with how they've set things up. The site is designed to be as open and accessible as possible, and right now they're doing that from the ground up.

Monday, 4 November 2013

Ambient Computing

That old 24" screen with the broken backlight may never be your main monitor again, but that doesn't mean you can't have a little fun with it:

Remove the backlight and frame, and you're left with a light and thin sheet that blends in with your surroundings, using available ambient light. When you're not using it, it just becomes a tinted window.

Sadly it didn't work quite as well on this laptop, due to the surface area of electronics behind the screen. It'd work much better on more recent models, but I don't have any I'm willing to sacrifice to the cause just yet.

Monday, 29 July 2013

Exifpeeler update: SSL

After months of letting it churn happily along, I finally spent 5 minutes making a fairly important change to exifpeeler: it now supports SSL. If you visit https://exifpeeler.com instead of just http://exifpeeler.com, your data should be secure in transit and not just at either end. 

I've left the http version enabled for now, but of course recommend you use the https version. While people probably aren't scrounging through all your web traffic looking for interesting tidbits, it never hurts to be careful.
(If this is the kind of thing you worry about, may I also recommend the excellent HTTPS Everywhere.)

Sunday, 30 December 2012

ExifPeeler: A web-based exif removal tool

I bought a cheap-and-cheerful VPS a year or two ago from the gentlemen at BuyVM.net. I've used it on and off for the many useful things that you can do with a VPS, but hadn't really tested its limits. But the server is single-core with an astounding 128MB of RAM and a 10GB hard drive; what can you really do with that?

Enter ExifPeeler. It does one thing and does it competently: You upload a batch of pictures, it strips out the EXIF data from them and gives them back to you nice and clean. You can then do anything you like with the pics without worrying about accidentally giving away your location or any other personal data.

Here's what it looks like.

And here's what it looks like after you upload files. Simple, right?

  • Batch upload. You can upload up to 100 files or 50mb at once, and download the results individually or as one big zip file. 
  • Unique URLs: Each pic has a unique and distinctive URL. Only you, or anybody you send the link to, will be able to see your images.
  • Secure Timed Destruction: After an hour, your files are deleted. There are no options to keep them. Because I have such limited server hard drive space, I can't afford to leave your holiday snaps lying around for weeks. Let's call it a security feature.
  • Cross-Browser Support: Works great on Firefox, Chrome, and Safari. IE9 and below support single-file upload only. I haven't tested with IE10, so let me know how it goes.
  • Mobile Device Support: It handles batch uploading from iOS, Android, and BlackBerry like a pro. I haven't tested on Windows Phone 7 or 8, so let me know if you try it.
  • Duplicate Renaming: If you upload 5 pics called 'image.jpg' at once, they'll all get appropriately renamed.
  • Relatively Graceful Failure: if any individual pictures you upload fail - say, you accidentally upload a word document at the same time - it will still clean all the legitimate pictures and list any failures separately at the end.
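The duplicate renaming could work something like the sketch below. The numeric '-1', '-2' suffix scheme is my assumption; the site only promises the files get renamed sensibly:

```shell
#! /bin/bash
# Save a file under its name, adding -1, -2, ... if the name is taken.
savename() {
    local name="$1" base="${1%.*}" ext="${1##*.}" n=1
    while [ -e "$name" ]; do
        name="${base}-${n}.${ext}"
        n=$((n + 1))
    done
    touch "$name"        # stand-in for actually writing the upload
    echo "$name"
}
cd "$(mktemp -d)" || exit 1
savename image.jpg       # image.jpg
savename image.jpg       # image-1.jpg
savename image.jpg       # image-2.jpg
```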
The server I'm running it on pretty much dictated the rest of the choices. One PHP script does all the admin (uploading, removing anything that isn't a photo, creating thumbnails, etc.) while it farms out all the hard work to exiftool. It double-checks that the EXIF data is definitely gone, and will refuse to even display the image if it wasn't successfully removed. Every 5 minutes a cron job runs and removes any files uploaded more than an hour ago.
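The cleanup cron job can be as small as a single find command. Here's a sketch run against a throwaway directory; the real upload path is whatever the PHP script writes to, and it assumes GNU find and touch:

```shell
#! /bin/bash
# The cron entry would look something like:
#   */5 * * * * find /path/to/uploads -type f -mmin +60 -delete
# Demo against a temp dir with one fresh and one stale file:
uploads=$(mktemp -d)
touch "$uploads/new.jpg"
touch -d '2 hours ago' "$uploads/old.jpg"    # GNU touch date parsing
find "$uploads" -type f -mmin +60 -delete
ls "$uploads"    # only new.jpg survives
```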

The key consideration is traffic. I've done some stress-testing, and it should be able to handle multiple file uploads per second. Under heavy load you might get a 5-second delay when uploading large batches (50+ files), but we'll have to see how it goes in real-world conditions.

Why an exif remover?
I really just wanted to see how easily it could be done. While my solution isn't particularly elegant, it works and is damn simple to implement. It consists of one php script, one third-party program, and one cron job. 

I also couldn't find another site that does the same thing. There are sites that will let you remove exif data from single files, there are sites that have exif removal as a bonus feature hidden away in menus, and there are of course hundreds of local exif-removal tools. I couldn't spot a web-based exif removal tool that lets you upload more than one file at once. This fills that niche.

Don't a lot of websites remove EXIF data anyway?
Yes, they do. Sites like Blogger, Facebook, and Flickr will prevent other users from seeing some or all EXIF data from your pics. The data is still present, and the companies can still use it for what they like. They just don't provide it to viewers.

Most smaller sites don't seem to remove EXIF data. If you're uploading pics to a forum or emailing them to somebody, it's definitely good to remove the EXIF first.

If I shouldn't send files around without removing the EXIF, doesn't that mean I shouldn't upload them to ExifPeeler?
Yes. If you're worried about it, strip out the data before you send it anywhere. The aforementioned exiftool is good and there are lots of other options. Exifpeeler is great for casual use but if you have information you genuinely need to keep private it's best it never goes on the internet at all. 

Where's the script so I can run it in my own project?
I'll probably put it up here soon, but in the meantime you can email me if you want more info. The script is nothing special, I just want to clean it up a bit and see how it works in the real world before I post it. 

How many times was EXIF mentioned on this page?

If you have the time try it out, and let me know if you manage to break it. If you do break it, please send a screenshot along with a brief description of what you were doing.

Monday, 17 September 2012

Dead Man's Switch on Linux, Part 1: Basic bash

I've always liked the idea of a dead man's switch. It's a partial fix for the 'what happens if I get hit by a bus' problem: how would you give your important account details to your family? How can you get a last minute message to your loved ones? Most importantly, who will delete your porn?

Traditionally, they work with the assistance of a third party. They've been used to launch nukes. Wikileaks used it to cause shenanigans. And you can sign up for some web-based solutions that will send pre-recorded emails for you when you show signs of being dead.

Of course, you have no control over that. The whole site might go down without you realising, rendering your efforts useless. So here's the same basic principle, implemented in a bash script on your own machine.

If you just want the script, skip to the end. It does things like this:

The Simplest Script Of All

We need three basic components to make this work:
  1. The main script file. When run, it checks to see if the timer has expired. If it has, it performs pre-set actions (like sending email)
  2. A method to 'reset' the timer. A one-line script file will work fine here.
  3. A way to trigger the first script to run. We'll use cron.
We also need a way to check whether we've hit the time limit. To keep things simple, I'm going to use the 'modified' time on the main script file. When it runs, it'll compare when it was last modified against the current time to act as the timer. To reset, you just need to run 'touch myscriptname.sh' to change the modified time. Easy, right? 

So here's the main script, set to do the most important task of all: Delete Your Porn.

#! /bin/bash
# Simple Dead Man's Script

timelimit=30 #Number of days to wait without update

let timelimit=$timelimit*60*60*24 #turn that into seconds
lastaccessed=$(stat -c %Y $BASH_SOURCE)
timenow=$(date +%s)
let timeleft=($timenow-$lastaccessed)
if [ $timeleft -gt $timelimit ]
then
    rm -rf $HOME/myporndirectory
fi

Here's the 'reset' script, saved somewhere on your desktop:

#! /bin/bash

touch pathto/themainscript.sh

And here's the line you have to add to cron (crontab -e to get there):
0 0 * * * pathto/themainscript.sh

Simple, right? Make the scripts executable, and if you ever go 30 days without running the reset script it'll delete all your porn. Careful with your holidays.

It can do anything that can be done from the command line. With minimal trouble it can ftp files around, modify/upload entire websites, start torrents, or just update twitter with what you had for lunch the other day. I want to use it to send an email.

The Slightly Less Simple Script

Obviously I took a simple concept and made it both unnecessarily complex and annoyingly basic. It now has a setup option that should work most of the time, and a 'test' option that emails yourself and makes sure it works.

Boring Notes:
  • The setup option assumes a generic install - that you have crontab access, that you have a home directory, etc. It will fail horribly if run as root.
  • The setup script creates a .dmswitch/ folder in your home dir. Inside will be the check script, the message file, and an attachment directory. The message file is plain text and as long as you need.
  • It also adds a reset_switch.sh file to your desktop. Each time you run this, it resets the counter.
  • The setup is optional. If you manually move the files where you like and change the variables, it'll work just fine.
  • I'm using sendemail to do the email sending since it plays nicely with gmail. You can use any other method you like. If you use the script unmodified you'll need to install sendemail.
  • Any files in the attachment directory will be attached to the email. No spaces in filenames though, because I am lazy.

Quick Install:
On debian/ubuntu, it takes five lines and some variable editing.
  • sudo apt-get install sendemail libio-socket-ssl-perl
  • gedit  dmswitch.sh (or nano, vim, emacs, etc.)
  • Paste in the script, and edit the email account details to your own. Save.
  • chmod +x dmswitch.sh
  • ./dmswitch.sh setup
Test it by running .dmswitch/check_switch.sh test. If the email settings are correct, you should have a test email arrive shortly.

To uninstall:
  • rm -rf .dmswitch/
  • crontab -e, remove the entry pointing to the script.
Finally, the script itself.
#! /bin/bash
# Basic Dead Man's Switch v1.0
# Options:
# 1) dmswitch setup
#    sets up the script in your home dir with some default settings.
#    best to set it up manually or automatically rather than a mix of the two.
# 2) dmswitch test
#    sends a test email to yourself.
# 3) dmswitch reset
#    'checks in' with the dead man's switch and resets the counter to zero.
#    Just does the same thing as touching the scriptfile.
# 4) dmswitch
#    default. if the time has expired, IT WILL SEND AN EMAIL.

#setup variables: where everything lives after 'setup'. The file names are
#just defaults; change them if you like.
setupdir=$HOME"/.dmswitch"
checkscript="check_switch.sh"
setupmessage="message.txt"
setupattachmentdir="attachments"
resetscript=$HOME"/Desktop/reset_switch.sh"

#some running variables are based on the setup vars
croncommand="0 0 * * * "$setupdir"/"$checkscript #cron line for how often it checks expiry. Default is daily.

#make sure you change the email settings.
dmdir="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
timelimit=30 #Number of days to wait without update
emailto="someemail@gmail.com" #The target address
emailfrom="myemail@gmail.com" #The account you send with
emailusername="myemail@gmail.com" #The username for your email account
emailpass="myemailpassword" #password
emailserver="smtp.gmail.com" #SMTP server (gmail shown)
emailport="587" #SMTP port
emailsubject="Automated email from Dead Man's Switch."
message=$dmdir"/"$setupmessage # the text file you want your message stored in
attachmentdir=$dmdir"/"$setupattachmentdir # put any attachments you want to include here
expired=0

#checks if the time has run out. Does the maths in unix time.
function checkifexpired() {
    let timelimit=$timelimit*60*60*24
    lastaccessed=$(stat -c %Y $BASH_SOURCE)
    timenow=$(date +%s)
    let timeleft=($timenow-$lastaccessed)
    if [ $timeleft -gt $timelimit ]
    then expired=1
    fi
}

#sends an email. replace sendemail with whatever program you prefer.
function sendemail() {
    attachmentlist=$(ls $attachmentdir)
    cd $attachmentdir
    sendEmail -f $emailfrom -t $emailto -u "$emailsubject" -s $emailserver":"$emailport -xu $emailusername -xp $emailpass -a $attachmentlist -o message-file="$message"
}

#sets up a directory to run from and creates the necessary files.
function setupdm() {
    mkdir -p $setupdir
    cp $BASH_SOURCE $setupdir"/"$checkscript
    cd $setupdir
    chmod +x $checkscript
    mkdir $setupattachmentdir
    touch $setupmessage
    echo "If you can read this, I'm dead or arrested or something" >>$setupmessage

    #append cron job to existing cron file
    (crontab -l; echo "$croncommand" ) | crontab
    #setup reset script on desktop
    touch $resetscript
    echo "#! /bin/bash" >>$resetscript
    echo "checkfile="$setupdir"/"$checkscript >>$resetscript
    echo "touch $""checkfile" >>$resetscript
    chmod +x $resetscript
    echo "setup complete"
}

#main script starts here

checkifexpired

if [ "$1" == "setup" ]
then setupdm
elif [ "$1" == "test" ]
then
    #send test email to yourself
    emailsubject="TEST: "$emailsubject
    sendemail
elif [ "$1" == "reset" ]
then touch $BASH_SOURCE
elif [ $expired -eq 1 ]
then
    #send the email, and disable the script from running
    sendemail
    chmod -x $BASH_SOURCE
else exit
fi

Go wild.
Part two will cover the fact that not everybody leaves their desktop on for months at a time and suggest several overly-elaborate ways to trigger the reset while running it on a remote machine.