Untappd – A Rant

I’ve held off on this, thinking maybe they would fix the issues.  However, the app has gone to shit.  It’s SOOOOO fucking slow to look up, check in, comment, etc. that it’s insane.  I was trying to check in beers tonight and finally gave up.  Waiting MINUTES yielded nothing.  The app took 60-plus seconds just to load.  This has gotten insane.  It’s down all the time, hardly works when it’s up, and is SOOO fucking laggy it’s hard to use even when it does work.  This needs to be fixed, guys, like now.  I know you want to add sponsors and make money, I get that, but when the app doesn’t work for shit I can’t even use it!  So, fuck this app, and fuck you all.  Fix this shit!!!!

Open letter to Comcast and the FCC

Hello,

With the recent Comcast rollout of nationwide 1TB data caps, I feel it is crucial that I submit my complaint. Data caps are a problem, and without swift action they will become a much larger issue in the near future, limiting not only innovative uses of the Internet but the entire global flow of information.

Data caps are not only an inconvenience to customers, they are against net neutrality at its core. This is nothing more than a money grab and an attempt to get people to stick with the dying cable TV model. Comcast even excludes its own related data from the cap!!

It unnecessarily impedes emerging video technologies such as 4K streaming while simultaneously punishing those who dare to download large games or files. It’s not even rooted in network congestion: https://www.techdirt.com/articles/20130118/17425221736/cable-industry-finally-admits-that-data-caps-have-nothing-to-do-with-congestion.shtml

For example: according to Netflix, streaming 4K content uses roughly 4.7GB per hour. Doing the math, that works out to roughly 213 hours per month, or about 7 hours PER DAY, before hitting said 1TB cap. Split that among 2 or 3 other members of a household and each person can watch at most a few hours per day. This is assuming you do absolutely NOTHING else with that Internet connection.

The further encroachment of data caps sets a dangerous precedent that, left unchecked, will stifle innovation and let ISPs restrict the flow of information into households simply to benefit themselves. This needs to be curbed quickly.

I urge you to please consider restrictions on, or an outright ban of, data caps on hardline Internet connections such as cable and DSL. I further ask that you investigate data caps on cellular data to determine whether there is actually any legitimate reason they exist. T-Mobile is a great example: they allow very specific traffic to NOT count towards a data cap. This is also against net neutrality.

Full disclosure: I’m not a Comcast customer; I am with Time Warner (now Spectrum). I do not have a data cap, but I average right around 2TB per month with what I consider normal usage, at least for the next generation. Data usage is only going to increase, and at a rapid pace, as new technologies emerge.

I’ll say it a second time: please consider a ban on data caps. This is nothing but the stifling of innovation, the clinging to an archaic business model, and the lining of ISP executives’ pockets. What we ACTUALLY need is more innovation, more competition, and a stronger Internet presence as a country.

Signed,
Dan Bunyard

Mozilla Firefox Performance in 2016

I’ll keep this one short.  WTF is Mozilla doing with Firefox?  How can the performance POSSIBLY be so crappy on a machine with 16GB of RAM, an SSD, and a discrete video card?  A 20-second lag when switching tabs, a 20-second lag even to minimize.  Startup with ~8 tabs takes MINUTES at best.  I want to like Firefox, I really do, but every other browser just destroys it in performance and, hell, is ACTUALLY USABLE, so I just cannot default to Firefox.  I’d love to know what the devs are thinking…

SmugMug Uploads through Astaro/Sophos UTM Hanging

Since I found exactly zero information on the topic I thought I would throw together a quick blog post in the event others run into trouble with this.  It’s been a few weeks since I last uploaded pictures so I’m not sure what changed; it might be the new “beta” firmware I’m running, but I’m not convinced of that.

Anyway, the problem is that when you attempt to upload files to a SmugMug gallery the upload will “hang” with the progress bars completely filled.  The bars also fill very quickly (much faster than your upload speed would allow) since the Sophos box is intercepting the transfer.  If you start digging into the logs on the Sophos box you will see something like the following:

2014:11:30-19:00:31 gateway httpproxy[5489]: id="0002" severity="info" sys="SecureWeb" sub="http" name="web request blocked" action="block" method="PUT" srcip="192.168.9.9" dstip="54.85.78.68" user="" ad_domain="" statuscode="504" cached="0" profile="REF_HttProContaInterNetwo2 (Everything Else)" filteraction="REF_DefaultHTTPCFFAction (Default content filter action)" size="2647" request="0xad9b1000" url="http://upload.smugmug.com/photos/xmlrawadd.mg" referer="http://www.bunyardpics.com/Family/Bunyard-Thanksgiving-20141129/" error="Connection to server timed out" authtime="0" dnstime="513" cattime="53" avscantime="5640" fullreqtime="61679447" device="0" auth="0" ua="Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/38.0.2125.122 Safari/537.36" exceptions="" reputation="neutral" category="179" reputation="neutral" categoryname="Media Sharing"

or

2014:11:30-19:03:49 gateway httpproxy[5489]: id="0002" severity="info" sys="SecureWeb" sub="http" name="web request blocked" action="block" method="PUT" srcip="192.168.9.9" dstip="54.85.152.234" user="" ad_domain="" statuscode="504" cached="0" profile="REF_HttProContaInterNetwo2 (Everything Else)" filteraction="REF_DefaultHTTPCFFAction (Default content filter action)" size="0" request="0xc8654000" url="http://upload.smugmug.com/photos/xmlrawadd.mg" referer="http://www.bunyardpics.com/Family/Bunyard-Thanksgiving-20141129/" error="Connection to server timed out" authtime="0" dnstime="36673" cattime="120" avscantime="5954" fullreqtime="62055195" device="0" auth="0" ua="Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/38.0.2125.122 Safari/537.36" exceptions=""

or

2014:11:30-19:14:37 gateway httpproxy[5489]: id="0002" severity="info" sys="SecureWeb" sub="http" name="web request blocked" action="block" method="PUT" srcip="192.168.9.9" dstip="54.85.78.68" user="" ad_domain="" statuscode="504" cached="0" profile="REF_HttProContaInterNetwo2 (Everything Else)" filteraction="REF_DefaultHTTPCFFAction (Default content filter action)" size="2647" request="0xa978800" url="http://upload.smugmug.com/photos/xmlrawadd.mg" referer="http://www.bunyardpics.com/Family/Bunyard-Thanksgiving-20141129/" error="Connection to server timed out" authtime="0" dnstime="114" cattime="0" avscantime="0" fullreqtime="62108802" device="0" auth="0" ua="Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/38.0.2125.122 Safari/537.36" exceptions="av,auth,content,url,ssl,certcheck,certdate,mim

depending on whether you have tried to create bypass rules to solve this issue.  Following the progression, the first is prior to any rules, the second is with a bypass domain in the web filter profile, and the third is with a bypass in the core exceptions list.  In all three cases you will notice the 504 connection timed out errors.  This is, I think (though I can’t prove it), related to the Sophos box trying to scan the picture upload.
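
If you want to watch these failures happen live, you can grep the web filter log on the UTM itself while retrying an upload.  This is just a sketch: I believe /var/log/http.log is the default web filter log location on UTM 9, but verify the path on your firmware version.

# run from a shell on the Sophos/Astaro box; log path assumed to be the UTM 9 default
grep 'upload.smugmug.com' /var/log/http.log | grep 'statuscode="504"' | tail -n 5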

So enough of the preamble, how does one fix this?  It’s actually pretty simple: you need to bypass the upload.smugmug.com domain entirely.  You can do so with the following steps:

  • Log into your Sophos UTM web UI
  • Navigate to Web Protection -> Filtering Options -> Misc
  • Under “Transparent mode skiplist” area click the plus (+) next to “Skip transparent mode source hosts/nets”
  • In the “Add network definition” box create a definition with the following:
    • Name: Smugmug Uploads (Or whatever name you want to identify this definition)
    • Type: DNS group
    • Hostname: upload.smugmug.com
    • Comment: [blank] (unless you want to make a more detailed note about this definition)
  • Click save then click the folder icon next to “Skip transparent mode destination hosts/nets”
  • In the list that populates on the left choose your “Smugmug Uploads” and drag it into the box under “Skip transparent mode destination hosts/nets”
  • Click “Apply” at the bottom of the “Transparent mode skiplist” area

You should now be able to upload pictures without issue!  Please leave a comment if you have questions/problems.

Migrating websites between Virtual Private Servers

So I recently found myself needing to move websites from one VPS to another.   In this case it was the same provider but different datacenters within the US.  While I was contemplating options I discovered a few rather simple tricks and thought I would share them here.  My research didn’t turn up any complete post on migrations (though a few came close) and I wanted a somewhat comprehensive guide.  As always this advice is given AS-IS and I cannot be held responsible if you destroy your data, bork your servers, or get your account suspended by your host.

Anyway, let’s get started.  This post assumes you are moving between similar platforms and architectures.  I’m going from CentOS 6 to CentOS 6, both x64 platforms.  If you are trying to migrate between dissimilar platforms (Debian to CentOS) or architectures (x86 to x64) there is a good chance these instructions will not work.  I’m also assuming you have installed all the correct packages on your new server.  You can do a “yum list installed” on your source server to see what you have installed.
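
If you want a quick sanity check that the new box has the same packages, dumping the installed package names on both servers and diffing the lists works.  This is a rough sketch and the file names/paths are arbitrary:

# run on both the old and the new server
rpm -qa --qf '%{NAME}\n' | sort > /tmp/packages.txt
# then pull the old server's list over and compare, e.g. from the new server:
scp -P 22222 root@OLD.HOST:/tmp/packages.txt /tmp/old-packages.txt
diff /tmp/old-packages.txt /tmp/packages.txt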

The first step is an initial sync of your files.  Depending on the size of your web root and the servers’ Internet connections this might take a while.  I suggest you start this before bed, then check it in the morning.  There is no need to worry if it fails, as we will be running this multiple times.  We are going to use rsync for the entire file sync process.  The command you need to run is:

rsync -avzhe 'ssh -p 22222' root@OLD.HOST:/var/www/ /var/www/

This command needs to be run from the DESTINATION SERVER.  If you are looking into the specifics of the rsync command please see here; in short, -a is archive mode, -v is verbose, -z compresses, -h gives human-readable output, and -e specifies the remote shell.  The only other thing I will note is that the number after the -p is the SSH port number.  22 is the default.
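
If you want to preview what will be transferred before committing to it, adding -n (dry run) to the same command is harmless:

# same command with -n added: lists what would be copied without copying anything
rsync -avzhne 'ssh -p 22222' root@OLD.HOST:/var/www/ /var/www/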

We can run this command as many times as needed, as it will only bring across changed files or those that didn’t exist the first time.  If the initial sync fails simply run it again.  Now we need to move over any config files; this is easiest to do with rsync as well – again, run from the destination server:

mv /etc/httpd /etc/httpd.old
rsync -avzhe 'ssh -p 22222' root@OLD.HOST:/etc/httpd/ /etc/httpd/

The first command moves the existing httpd directory to a backup location while the second brings over all the configs from the old server.  There are a handful of directories you will want to run this on, including, but not limited to, the following (a loop covering the rest is sketched after the list):
/etc/httpd/
/etc/postfix/
/etc/php.d/
/etc/mail/
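
For the remaining directories a small loop keeps it tidy – a sketch, run from the destination server, following the same backup-then-sync pattern as the httpd example above:

# back up each existing config directory, then pull the old server's copy over
for dir in postfix php.d mail; do
    mv /etc/$dir /etc/$dir.old
    rsync -avzhe 'ssh -p 22222' root@OLD.HOST:/etc/$dir/ /etc/$dir/
done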

Once you have done this it’s time to pick the migration time and run a few more commands.  I chose early on a Sunday for this final migration.  Shut down the Apache service on the source server.  Time to migrate the MySQL databases.  This command works nicely but needs to be run from the SOURCE SERVER:

mysqldump -u root -pPASSWORD --all-databases | ssh -p 2222 root@NEW.HOST 'cat - | mysql -u root -pPASSWORD'

Of the PASSWORD variables listed there, the first is the MySQL root password of the source machine and the second is the MySQL root password of the destination machine.  You will be prompted for the system (SSH) root password of the destination machine when you execute this command.   The number after the -p, as before, is the SSH port number.   As with the rsync, this command may take a while.  Now it’s time to run the final sync on the web root.  Same command as before:

rsync -avzhe 'ssh -p 22222' root@OLD.HOST:/var/www/ /var/www/

Assuming not much time has passed between the first time you ran this and now it should run fairly fast. 
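
Before flipping everything over it’s worth a quick check that the databases actually arrived:

# run on the destination server; lists the databases MySQL now knows about
mysql -u root -p -e 'SHOW DATABASES;'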

Moment of truth: shut down all web services on the SOURCE server: MySQL, httpd, postfix, etc.  Start all services on the DESTINATION server.  Then change over your DNS to point to the new server.  Assuming everything came up as expected you should be golden!
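
On CentOS 6 the stop/start commands look something like the following; adjust the list to whatever services you actually run (the MySQL service may be named mysql or mysqld depending on how it was installed):

# on the SOURCE server
service httpd stop; service mysqld stop; service postfix stop
# on the DESTINATION server
service httpd start; service mysqld start; service postfix start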

Test all the sites you had running on the old server paying close attention to things like SSL and 301 redirects handled by the .htaccess files. 

You have now migrated to a new server!  While this seems like a lot of steps, when you actually dig down into it there really isn’t much here.  Feel free to leave a comment with your experiences or any questions/comments you may have. 

Raspberry Pints install on other operating systems

There was some interest in a guide to installing Raspberry Pints on various other operating systems, so I will attempt to provide some guidance here.  If there are any questions please don’t hesitate to ask.

Standard disclaimer here.  I’m not liable if your machine bursts  into flames, your girlfriend leaves you, etc.  Everything here is provided AS IS without any warranty expressed or implied.  Use these instructions at your own risk!

Now that we have that out of the way, let’s begin!  Your first step is to install XAMPP.   I won’t provide instructions here as there is plenty of information on their website.  However, if you run into trouble, please let me know.

After the install you should have a working web server and MySQL server on your machine.  You can browse to http://localhost to confirm.  If everything is working correctly, the next step is to set a MySQL root password.  You can set one at this address: http://localhost/security/
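
If you would rather set the root password from the command line than through the security page, mysqladmin can do it.  The path below assumes a Linux XAMPP install under /opt/lampp (on Windows it lives under C:\xampp\mysql\bin), and the password is obviously a placeholder:

# set the MySQL root password (no password is set on a fresh XAMPP install)
/opt/lampp/bin/mysqladmin -u root password 'YourNewPassword'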

The next step is to obtain a copy of Raspberry Pints here.   Unzip it and move it into your XAMPP web root (the htdocs directory).  I opted for a subdirectory of “rp” just because I have multiple websites on my machine.  Make sure that, whatever you choose, the index.php file ends up in the root of that subdirectory.
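
On a Linux XAMPP install the unzip-and-move step looks roughly like this; the zip and folder names are hypothetical, so match them to whatever you actually downloaded (on Windows the web root is typically C:\xampp\htdocs instead):

# extract the download and drop it into the web root as "rp"
unzip RaspberryPints.zip
mv RaspberryPints /opt/lampp/htdocs/rp
ls /opt/lampp/htdocs/rp/index.php    # index.php should be at the top of the subdirectory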

Now you need to run the installer.  Assuming you chose the “rp” directory as above, the installer will be accessible at http://localhost/rp/install, but adjust as needed.  Follow through the steps, using the official instructions for reference.  The MySQL password will be the one you set above on the security page.

All done!  To full-screen the display hit F11.  You will have to use the arrow keys to navigate, as it appears this isn’t set up for a mouse.  Hopefully this is helpful to someone.  If you encounter any issues please don’t hesitate to let me know so I can assist and adjust this guide as needed. 

Fuck You Chipotle!

Adventurrito, I mean seriously…did you not expect the load on your servers for this 20th anniversary challenge?  For a prize worth $445 a year MAX this is pretty fucking worthless if I have to spend all evening trying to reach the site and send my answer.  Especially if this will continue for the duration of the challenge.  Y’all are going to spend so fucking little on prizes (and obviously the infrastructure isn’t anything to write home about) that this is actually beyond worthless.  Get your shit together or GTFO.

Arduino Temperature/Power Loss Monitoring – Part 1

Since I was unable to find a complete post on this I decided to write one.  I found a lot of good information on other blogs and websites, but nothing that covered exactly what I wanted to do.  I’m going to put together what I hope will be a complete guide for home temperature monitoring, power-loss alerting, and reporting/graphing (though that last piece will come in a later post).  You are welcome to use any part of this or use it all.  All code is open source under WTFPL.

Basically what I wanted was the ability to monitor the temperature at multiple points in my house.  I also wanted to monitor the power and alert on a power outage.  Since the Arduino board is plugged into a UPS it will stay running for a while after the power goes out.  Obviously without a UPS your Arduino would be unable to alert on a power outage.

First things first, here is a parts list that I used to make this happen:
– 1 Arduino Uno R3
– 1 Arduino Ethernet Shield R3
– 1 4.7k Ohm resistor
– As many DS18B20 temperature sensors as you want.  I will use 4 for this project.
– An LM35 analog sensor from here (optional)
– Wire – I am using spools of 2-pair telco wiring that I’ve had in my basement forever.
– Breadboard for connecting sensors and power to the Arduino board.
– ~9V Power adapter for Arduino board (only if you don’t have USB power nearby)
– 5V SWITCH MODE power adapter for power loss monitoring (optional)
– LAMP web server stack.  If you want to run this on Windows I suggest XAMPP.

(Depending on where you go for this and what you already have you should be able to get away for about $100 US)

Additionally here are the skills you need:
– Basic soldering
– Basic web programming/LAMP skills
– Wire pulling
– Time

Final notes before I get started with the actual how-to.  I will do my best to give enough details on everything (without going overboard) for you to make this work.  If you get stuck, though, please feel free to leave a comment or message me.  I also give no guarantees on the code.  If it works for you, wonderful.  If it sets your computer on fire, I’m not responsible.  Oh, and if your system gets hacked from use of this code, that’s also not my responsibility.  All code is provided AS-IS.  I am also going to attempt to link to all the places where I found parts of this project, but I apologize if I don’t cite your source.  There were a TON of different sources, so there is a good chance I will miss someone.

Alright – let’s get this show on the road!

Wiring things up is quite straightforward.  I will mention that I’m using an analog sensor on the breadboard just because I was able to get a free sample here.  I would not recommend using these for your whole project, as the voltage drop across the cable runs will cause problems.  I just stuck one on the board because it was free, so I figured why not.

Alright, connecting the DS18B20 sensors is really easy.  We will be using normal (externally powered) mode instead of parasitic mode, as this provides more consistent readings and allows for longer cable runs.  In normal mode each sensor gets VDD tied to 5V, GND tied to ground, and its data pin on a shared one-wire bus back to a single Arduino digital pin, with the 4.7k Ohm resistor pulling the data line up to 5V.

Pinout:

Schematic:

(I didn’t make these, they came from here)

Additionally if you are connecting up an analog sensor as I did the wiring is a slight bit different:

(source)

Alright, so things are wired up; now you need the addresses of the digital sensors.  I found this wonderful post on obtaining the addresses, so I will not cover it here.  Please see that post for information on getting the addresses of your sensors.

Here are the files you will need to get started: Zipped Files.  Obviously the .ino file needs to be uploaded to the Arduino board after entering the correct server IP, host, MAC, and sensor addresses.  The .sql file needs to be imported into your database, and write.php needs to be edited with your DynamicDNS host if you are doing this from home.  You can sign up for an account here; you just have to make sure one of the devices on your network is updating the IP address.  Additionally, if you are writing to a local web server you can remove this part, as it’s merely a security measure to prevent anyone else from writing to the database.  After you have edited this file, upload it and the lib directory to your web server.
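
As a rough sketch of those last two pieces – the file, directory, and host names here are hypothetical, so use the ones from the zip and your own setup:

# import the schema/tables into MySQL
mysql -u root -p < temperature.sql
# copy the edited write.php and the lib directory to your web server's document root
scp -r write.php lib/ user@yourwebserver:/var/www/html/temps/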

Assuming everything is working correctly you should start seeing temp and power readings in your database:
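
If you would rather check from a shell than a GUI, a quick query works too.  The database and table names here are guesses – use whatever the .sql file actually created:

mysql -u root -p -e 'SELECT * FROM temperature ORDER BY id DESC LIMIT 5;' yourdatabase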

At this point you should be good to go.  You can watch the serial output from the Arduino board on your PC to make sure things are doing what they are supposed to.  The logs on your web server can also be useful if you encounter problems.  Hopefully this post helps someone, and please feel free to leave a comment with additions/questions/problems!  Thanks for reading!