Tuesday, May 12, 2015

SSL verification does NOT prevent MITM server-to-server attacks

Man-in-the-middle, or MITM, is a specific attack whereby an attacker injects themselves into the communication stream between the sender of a message and its recipient. The most common example on the Internet is between a web browser and a web server. I am not disputing the necessity of certificate verification in that example: last-hop MITM defense is an essential component of SSL security, even though rogue but legitimate-looking certificates (e.g. those generated or issued for law enforcement) are nearly impossible to detect. However, server-to-server MITM defense is far more dubious.

Let's suppose I am an attacker for a moment and I find a way to inject myself between two servers in your web application (e.g. web server and database server). What is my behavior? Dumb attackers will take the low-brow approach and try to access the communication stream continually, which seems to be the use-case that's bandied about for SSL with verification in the server world. However, ask any business owner which is more important to them: application security or application uptime? Given the choice, the business owner chooses uptime over security every single time. Therefore, as an attacker, I simply let traffic through and only intercept one out of every 1,000-ish requests, or for a random 5-10 minutes outside of regular business hours and 20 minutes on weekends. I also use a bogus certificate, because my goal isn't to intercept the traffic but rather to randomly take down the application over and over and over again for short periods of time. IT and web developers believe that turning on SSL with verification offers better security, so this strategy actually works against them in a MITM scenario because that "security" reduces application uptime. The first approach (1 out of 1,000 requests) is likely to NEVER be discovered, but will really irritate and annoy everyone, especially IT and the web development team. The second approach is more obvious but infuriating: it wakes multiple people in IT and on the web development team anywhere from 10 p.m. to 4 a.m. weekly during the night shift, and is always followed shortly by the horrifying words, "never mind, it started working again." Assuming anyone ever finds and fixes the MITM attack, the result is only a bittersweet victory because you can't punch the attacker in the face. I, the attacker, have won by driving you insane and depriving you of much-needed sleep. After all, IT is the modern insane asylum.

SSL verification does NOT defend against the intelligent MITM attack in a server-to-server environment, and there is little real reason to turn it on in the first place. Most servers sit in a data center. Data centers are generally only one or two hops away from a BGP router (the core Internet routers). The distance between servers is typically 4 to 6 hops at most, and the amount of traffic that has to be processed at that level is pretty mind-blowing. To conduct a MITM attack at that level, an attacker would have to target a single application, which is hardly worth the effort, or break the whole data center, at which point your paltry web app is not the only thing that's been violated. Or break a BGP router, which the NSA apparently did when Egypt's Internet access went out a couple of years ago - supposedly a bad firmware modification made by the NSA hosed the BGP router and took out an entire country's Internet access. Oops. My heroes. (#sarcasm) So the only logical conclusion is that nothing anyone does will ever stop a MITM attack at the server-to-server level, and an attacker is more likely to break the entire data center the server sits in than an individual application, because the larger target is far more tempting - as evidenced even by the NSA.

To add one final nail to this coffin, SSL with verification in server-to-server communications also causes problems both in setting it up and in keeping it running. First you have to construct a PEM certificate chain. Then you have to get the software to actually use the certificate chain, which can be a challenge in its own right. Then you have to reconstruct the PEM certificate chain because you found the wrong certs, or they are in the wrong order, or you forgot one because no one documents anything - and when they do document things, the documentation isn't kept up to date. Then the application finally starts working after wasting four hours of your life. Then, around 2038, the application suddenly stops working because one of the certs expired and you have to remember how to do it all over again. Is your web app still going to be around that long? You hope not today but...surprise, surprise! Some applications stick around that long. Remember Y2K? Yeah, well, tossing an expiration date into an application is totally going to come back to haunt you when you are 60+ and have early-onset dementia or Alzheimer's. But you defended against MITM attacks. Or did you? You can't remember and don't actually know...so why care until someone makes this extremely simple and easy to do in a way that makes sense?
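For the record, the chain-construction step itself is just concatenation in the right order - leaf certificate first, intermediates next, root last. A minimal sketch (the file names here are hypothetical; substitute whatever your CA actually handed you):

```python
# Hypothetical file names - order matters: the server (leaf) certificate
# first, then each intermediate, with the root certificate last.
CHAIN_ORDER = ["server.crt", "intermediate.crt", "root.crt"]

def build_chain(paths, out_path="chain.pem"):
    """Concatenate PEM files into a single certificate chain file."""
    with open(out_path, "w") as out:
        for path in paths:
            with open(path) as f:
                # Strip stray whitespace so each PEM block starts cleanly.
                out.write(f.read().strip() + "\n")
    return out_path
```

Getting the order wrong is exactly the "reconstruct the PEM certificate chain" step above, so verify the result before blaming the software.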

In a MITM attack scenario, you and your personal sanity - not your web application - are already hosed. So you can turn off SSL with verification in your web applications and SDKs now: it has less maintenance overhead, may even help long-term application uptime, and therefore keeps your personal sanity intact. In the case of an SDK, at least make verification an option that can be turned off instead of turning off SSL altogether. Finally, everything surrounding SSL certs needs to be made easier. Way easier.
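In Python, for instance, keeping the encryption while dropping verification is a two-line change (a sketch - whether your particular SDK exposes an equivalent knob varies):

```python
import ssl

def unverified_context():
    """TLS context that still encrypts traffic but skips certificate chain
    and hostname verification - encryption without verification."""
    ctx = ssl.create_default_context()
    ctx.check_hostname = False   # must be disabled before changing verify_mode
    ctx.verify_mode = ssl.CERT_NONE
    return ctx
```

Pass the resulting context to urllib.request.urlopen(..., context=ctx) or whatever HTTP client the application uses.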

Friday, May 01, 2015

Portable Apps on a very fresh Windows installation is a bit buggy

I recently reinstalled Windows. Portable Apps shaved a ton of time off the reinstallation process. It is awesome to return to the DOS days of computing on the modern desktop, where each application is self-contained - as all applications should be.

However, during my reinstall, running the Portable Apps updater at first resulted in the message, "The downloaded copy of ... is not valid and can not be installed. This could be due to an incomplete download or other network issue. Please try running the updater again when complete." Re-running the updater resulted in the same message. There's something that could be said here about doing the same thing over again, expecting different results, and sanity.

I eventually resolved the problem after I realized that I hadn't run Internet Explorer yet. So I ran IE for the first time, clicked through all of the dialog boxes, and then shut IE down. After that, the Portable Apps updater worked great. I've always half-figured that IWebBrowser2 was behind the scenes of the updater, which unfortunately makes it susceptible to IE's weird initialization quirks. Not really a Portable Apps problem, but the error message could be adjusted a little bit.

Monday, April 13, 2015

Corel PaintShop Pro X7...still not worth using

I am in the process of finishing up a reinstall of Windows on shiny new hardware. During my adventure, one of the pieces of software I've been needing to upgrade or replace for some time has been Photoshop. I've got CS3 Extended (picked it up for about $300 as an upgrade from 6.0) and, while it serves its purpose, the bugs on newer versions of Windows are incredibly frustrating (even for someone like me who rarely uses it). Adobe Creative Cloud is a non-starter for me for a wide variety of reasons that have already been beaten to death elsewhere on the Internets. I fire up Photoshop about once every 3-4 months, so $120/year is pretty absurd. $1,500+ for CS6 is completely bonkers. Photoshop Elements also doesn't have the feature set I need.

Here is what I depend on in order of most-used:

  1. The color picker in Photoshop. The active response while dragging sliders around and the ability to select specific hue, saturation, and brightness values. The ability to copy and paste hex codes as well. I also need the full dialog (especially the alerts), not the cut-down version of Elements (which lacks alerts).
  2. The vector toolset.
  3. The text toolset.
  4. The style box. Mostly for drop shadows.
  5. Layers.
  6. Layer groups.
  7. Polygon selection tool.
  8. Unlimited undo/redo.
Everything else is almost useless to me. I realize my needs are in the minority, which is why my #2 need is going to have most people who use Photoshop going, "Wait? People actually use that? Why not Illustrator? Or Inkscape?" It's the blend of vector and raster that web development needs (e.g. icons), and Photoshop still does that best (IMO). I operate on a fixed-size canvas of these things called pixels. Maybe you've heard of them and this thing called the web before (and possibly print). Yeah, welcome to 2015.

But enough about Photoshop. I was searching for "alternatives to Photoshop" and an article came up with a list of them. The typical #1 on the list was GIMP [sigh] - the ONLY graphics editing software that still lives up to its name. (It's 2015 and GIMP still doesn't have real vector layer support?) Now that that's out of the way, a few other non-starters showed up as well. Then PaintShop Pro showed up near the end of the list and I went, "Wait...why does that name sound familiar?" I searched for it and had a nostalgic flashback. I had used PaintShop Pro 3 WAAAAY back in the day when I couldn't afford much of anything, so I just ran the non-expiring shareware version forever. I remember reading that Corel picked them up and then I never heard about the product again.

Until two days ago. PaintShop Pro is apparently "frequently" compared to Photoshop Elements. Useful information on the product is scant (at best) but, for some reason, it is covered by PC Magazine - one of the most coveted magazines for software publishers to get into, which also somehow still exists. Unfortunately, the current version of the product (X7 - aka version 17) is severely lacking in the core tools I use and is clearly targeted at the consumer market while still missing the mark - maybe find a different target audience? Like me. Target me. Corel, stick me on your beta testing program.

  1. The color picker is woefully inconsistent and nearly useless for my needs. There's just something basically perfect about the Photoshop color picker: select the hue first, then saturation and brightness, and get the right color and intensity in a fraction of the time. Now, PaintShop Pro X7 does have some really interesting things going for its picker, such as the complementary color selection interface, but the online Adobe color wheel is just as good, if not better, and is free. I am actually a bit surprised to see that fairly advanced feature in PaintShop Pro because it targets people who understand color combinations...a topic that only a very specific group of people, who had to be specifically trained on it, understand. Which makes the next failure utterly confounding.
  2. The vector toolset is broken in a critical way. I use the pen/point vector tool in Photoshop, which is essentially a mashup of PaintShop Pro's freeform and bezier curve tools. (The GIMP pathing tool does it right, except there is no way to make a vector layer in GIMP, which makes it a completely useless piece of software.) With Photoshop, I make polygons that sometimes have bezier curves in them. PaintShop Pro's bezier curve tool terminates drawing once a point is placed after a curve - it's really weird, and I really did try for several hours to make it work, but it is broken. Photoshop and GIMP allow "mixing" the two operations together. The PaintShop Pro vector toolset is not terribly responsive to common keyboard usage either. At this point, my hopes of finding a reasonable, up-to-date, affordable piece of software (anything under $100 is pretty affordable) were crushed. On the plus side, text is able to follow an existing vector path, similar to later versions of Photoshop that I don't have access to. So there's that going for it.
  3. Photoshop seems to have better font rendering and adjustment options than PaintShop Pro. Selecting a font in PaintShop Pro X7 is about on par with Photoshop CS3. That said, selecting a single font out of hundreds for a specific purpose is a time-consuming and annoying task that no tool, to date, solves with ANY elegance. And with THAT said, TrueType Fonts (TTF) are a disaster because there aren't any actual Standards for creating fonts that I can see. When I say size "20" in a program like PaintShop Pro or Photoshop, I expect all fonts to fit in the same box on the screen and follow the same rules as all other fonts. No one agrees on what 20 actually means, so the end result is that every program renders the same font differently and font authors seem to get to choose what it means to them. And don't get me started on spacing - spacing between letters is so random in most fonts, it makes me wonder if font authors actually test the fonts they make. This is stupid. We should scrap the clearly broken TTF format and start over.
  4. PaintShop Pro's drop shadow dialog leaves a LOT to be desired. Big, heavy, ugly drop shadows. The dialog itself looks like they bolted it onto the side of the product to copy Photoshop back in the late 1990's and never touched it again. Also, the layer styles are buried under a tab. The other styles are limited and need work too.
  5. PaintShop Pro's layers aren't actually that bad. I'm not sure why they have so many types of layers. Maybe for performance reasons? I would only end up using vector and some raster layers, so I guess "hooray" for the extra layer types. I really like the expandable option of vector layers into the individual vector drawings. That's a nice-to-have feature but not absolutely necessary. The thumbnail images seem to be nearest neighbor instead of bicubic scaled.
  6. For some reason I was unable to create a layer group until a few seconds before writing this because the option was previously grayed out (not sure why that was the case). Now that I see them, I'm on the fence because layer groups look just like a regular layer and there could be confusion - but I suspect it would just take time to get used to. Dragging a layer onto a layer group to add it to the group is a bit finicky. However, Photoshop groups are equally finicky. No one, to date, seems to have figured out drag-n-drop support for layer groups which seems odd because groups are very useful and drag-n-drop, from a programming perspective, has existed since Windows 95.
  7. There is no raster polygon selection tool in PaintShop Pro X7. Uh...okay. That makes it much more difficult for me to cut a unicorn out of a picture and slap it on a different background while it holds a heart-shaped spatula in front of a BBQ grill which I, in turn, print onto a custom Home Depot gift card to give to someone. (Yes, that's a thing. That I do. Don't judge it until you've tried it and seen the expression on the recipient's face. Classic.) Freeform and automated selections are "nice" but polygon selection is MUCH better as it is significantly more refined/accurate while, at the same time, requires fewer mouse actions and thus saves my wrist and eyes from cutout tedium.
  8. I didn't test how much undo/redo there is in PaintShop Pro, but thank goodness multiple Ctrl+Z presses mean a real undo operation. This is the only area where PaintShop Pro beats Photoshop, which requires doing the finger dance of Ctrl+Alt+Z (aka absolute bologna) to undo multiple levels. In Photoshop, Ctrl+Z does a single undo and then another Ctrl+Z undoes the undo - dude, that's what Ctrl+Y is for. The only thing better would be fully rebindable keys so that people could map their keyboard to do whatever is most convenient for them.
The one thing I really love in PaintShop Pro is how files are displayed as a series of tabs instead of the archaic MDI method of display (multiple windows in a window, which is what Photoshop does). I'm a huge fan of tabbed editing in my text editors, so to me it makes sense for image editing too. IMO, Crimson Editor does tabbed editing correctly (among countless other things that it does right). You can keep IntelliJ, Notepad++, Eclipse, Sublime, Emerald Editor (Crimson's failed successor), and the rest of those disasters of text editors. MDI interfaces like those in Photoshop serve little to no purpose. On the tabbed UI front, PaintShop Pro beats Photoshop hands-down. If you need to see two images side-by-side, it is easy to turn off the tabbed interface (Windows -> Tabbed checkbox). Maybe a newer version of Photoshop does the same thing, but the Elements 13 screenshots and videos I've seen don't indicate that anything has changed there.

Anyway...as I said earlier, PaintShop Pro X7 doesn't have a sufficient vector toolset to do serious work in. Its target audience seems to be the consumer crowd. That's a shame, because the (rather large) audience that isn't being catered to by either Corel or Adobe is the web designer. I make do in Photoshop, but with the vector toolset near the bottom of the list of tools and the way it feels when using it, it is clear that Adobe doesn't care about it. Same with PaintShop Pro X7. Vectors seem to take an unfortunate backseat in raster graphics packages - partly because people don't understand what to do with them (an education issue) and partly because the tools for manipulating them in what are largely raster programs are rather weak (a problem that feeds back into the education issue). When you marry vectors with powerful post-processed raster styles (drop shadows, borders, etc.), you get something rather unique: mostly scalable raster objects that generally look really good at any size. That sort of thing resonates with the web designer and application developer in me.

Saturday, March 28, 2015

Semi-transparent GIF images

Are you ready to return to the 1980's? Uh...me neither. But this post will take you back. And your head will hurt thinking about what I've done to the Internet.

Today, I was looking at this logo on the PHP website trying to figure out why it was so distracting:

[Image: the PHP logo at its displayed size]

And then, after digging around for a bit, I realized that the image was larger than it looked:

[Image: the same PHP logo at its actual, larger size]

Then I realized that the browser was resizing the image down to the smaller size. The first image looked a bit anti-aliased, which reminded me that a lot of web browsers use bicubic scaling to resize larger images. I then realized that this behavior could be abused when applied to a 1-bit alpha channel, such as the one found in GIF images.

GIF is one of the oldest image file formats. It was invented in the late 1980's and is still used on the Internet today for some animations. One of the areas it has traditionally been weak in is transparency. PNG has largely supplanted GIF for static images. MNG (animated PNG) was supposed to be the replacement for animated GIF, but it never took off for a variety of reasons. Regardless, GIF keeps finding ways to come back again and again. Beyond being limited to 256 colors (there are ways around that limitation via GIF animation palettes), one of the format's greatest weaknesses is the inability to store semi-transparent data: a pixel is either fully opaque or fully transparent. The end result is that "jaggies" appear along the edges of most GIF images and are particularly noticeable in transparent animated GIFs.

In recent years, browser vendors introduced smooth resizing algorithms into the image presentation core. This happened both because it makes websites look nicer when images are resized and because processing power became sufficient to handle the extra load on the CPU. Depending on the browser, display of images may simply be offloaded to the GPU, which is specialized hardware for doing things like resampling image data (i.e. doing this became "free" from a programming perspective). Regardless, the effect is widespread enough that it can now be abused. Here's how:

  1. In an image editor, take a GIF image and stretch it to 11 times its size in height using Nearest Neighbor (not bicubic!).
  2. Delete the appropriate number of pixels to represent the opacity. Do transparent first and solids second on the top half of the image and solids first and transparent second on the bottom half of the image.
  3. Save the file and hardcode the original size in a HTML document.
  4. ...
  5. Profit?
The browser operates on RGBA and will apply bicubic scaling to all four channels. Making the GIF 11 times taller gives 12 alpha levels in steps of 1/11 (roughly 9 percent apart): if 8 of the 11 vertical pixels are one opaque color and the remaining 3 are transparent, the final calculated pixel comes out roughly 73 percent opaque.
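The arithmetic can be sketched in a few lines of pure Python (no imaging library needed): each source pixel becomes a column of 11 stretched pixels, and the fraction of opaque pixels in that column is the alpha the browser's averaging produces.

```python
SCALE = 11  # vertical stretch factor from step 1

def column_pattern(opacity, solids_first):
    """Booleans (True = opaque pixel) for the 11-pixel column that stands in
    for one source pixel with the desired opacity (0.0..1.0)."""
    solid = round(opacity * SCALE)
    pattern = [i < solid for i in range(SCALE)]
    # Per step 2: solids first on one half of the image, transparent first
    # on the other, so edges blend in the right direction.
    return pattern if solids_first else pattern[::-1]

def effective_alpha(pattern):
    """Alpha the browser computes when it averages the column back down."""
    return sum(pattern) / len(pattern)
```

Asking for 70% opacity yields 8 opaque pixels out of 11, which averages back to roughly 73% - close enough that the eye doesn't care.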

Now some of you might be thinking, "Won't this bloat the file size?" Not necessarily. Under the hood, GIF uses LZW compression, which builds up a dictionary of repeating sequences and compresses the file based on that information. The probability is quite high that the sequence between any two given rows will be identical, which may translate to a mere 10 to 20% increase in overall file size. Anyone implementing this in software should try both horizontal and vertical stretches to see which direction produces smaller files, but I'm leaning toward vertical as being better.

Now my pretties, fly and fill the Internets once more with animated GIFs but this time with semi-transparency!

Saturday, January 17, 2015

Quick-n-dirty Postfix and Dovecot setup for an internal notification server

Two of my least-favorite open source products are Postfix and anything that interfaces with it. I use Postfix and Dovecot extensively and still find new annoyances with both of them. My major gripe is the lack of a simple package in Ubuntu that asks a few questions and then automatically sets up a working mail server based on the answers.

The objective of this post is to show a simple setup of Postfix and Dovecot for Ubuntu 14.04 LTS to enable command-line scripts on a box on a local network to send e-mail to itself and then retrieve those e-mails over POP3 using a normal e-mail client. This is NOT a full-blown e-mail server setup. It is a notification message drop with POP3 access. The e-mail client will only be able to retrieve e-mails from the box (not send them), which completely eliminates the possibility of the box accidentally turning into a remote mail relay.

First, install Postfix and Dovecot:

sudo apt-get install postfix
sudo apt-get install dovecot-pop3d
(I don't recommend the 'mail-stack-delivery' package. It includes IMAP support, which I'm not a fan of for something so basic. Only install the packages you need. I installed 'mail-stack-delivery' and decided it wasn't the right choice and cleaning up the garbage it left behind took longer than it should have.)

Next, set up a user account for handling the e-mails:

sudo adduser cronmail
Name the account whatever you want.

Edit '/etc/postfix/main.cf'. Change these options:

inet_interfaces = 127.0.0.1
mailbox_command = /usr/lib/dovecot/dovecot-lda -f "$SENDER" -a "$RECIPIENT"
That isolates SMTP to localhost only access and routes incoming messages to Dovecot. That last part is important or there will be major hair-pulling as messages get routed to the wrong place.

Next, verify the Dovecot configuration with 'dovecot -n'. If anything is off, visit '/etc/dovecot/conf.d/' and use grep to find the offending bits.

Restart Postfix and Dovecot:

sudo postfix reload
sudo restart dovecot
Now verify that only the ports you want open are open and that SMTP is only available on 127.0.0.1:

sudo netstat -anpt
Once you are satisfied, set up your e-mail client and use the new user you set up (cronmail@yourservername).

Finally, send an e-mail to the new e-mail account. If you use the Ultimate E-mail Toolkit, it is important to remember to disable the DNS lookup via 'usedns':

$headers = SMTP::GetUserAgent("Thunderbird");

$options = array(
 "headers" => $headers,
 "textmessage" => $message . "\n\nSent by [yourservername]",
 "server" => "localhost",
 "secure" => false,
 "usedns" => false,
 "username" => "cronmail",
 "password" => "*******"
);

SMTP::SendEmail("cronmail@yourservername", "cronmail@yourservername", "[yourservername] Some Notification", $options);
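If you aren't using the Ultimate E-mail Toolkit, Python's standard smtplib does the same job against the localhost-only Postfix. A sketch (the server and account names mirror the placeholder 'yourservername'/'cronmail' used above):

```python
import smtplib
from email.message import EmailMessage

def build_notification(body, server_name="yourservername"):
    """Build a self-addressed notification message for the cronmail drop."""
    addr = "cronmail@" + server_name
    msg = EmailMessage()
    msg["From"] = addr
    msg["To"] = addr
    msg["Subject"] = "[%s] Some Notification" % server_name
    msg.set_content(body + "\n\nSent by [" + server_name + "]")
    return msg

def send_notification(msg):
    # Loopback-only SMTP on port 25; no TLS or auth needed for this setup.
    with smtplib.SMTP("localhost") as smtp:
        smtp.send_message(msg)
```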
And that's it! Hopefully this saves someone a few hours. Again, it isn't a full-blown mail server - just a notification drop.

Wednesday, January 14, 2015

Do code overviews NOT code reviews!

Programmers dread the "code review". This is where a programmer sits down with his or her peers and the peers bash the programmer's code - and, in not so many words, tell the programmer that they are a terrible person. The code review is ego boosting/ego crushing disguised as a quality assurance practice. Well, that's what happens in a lot of code reviews and, when it happens, it is a form of bullying. The word "review" implies that the code is being judged:

Review: "a formal assessment or examination of something with the possibility or intention of instituting change if necessary."

Judged: "form an opinion or conclusion about" or "give a verdict on someone."

Also, the code review involves not only judgment: the peers also act as jury and executioner in some sort of twisted intervention:

Intervention: "an occasion on which a person with an addiction or other behavioral problem is confronted by a group of friends or family members in an attempt to persuade them to address the issue."

Code reviews are the equivalent of walking into the principal's office in school and getting flogged by a student. Everyone dreads the idea, so why do them? Code reviews are actually from the 1970's (no joke!), which means we need to seriously question their relevance. They are outdated and have no place in the modern workforce. Instead, we need to be doing the more modern "code overview".

What is a code overview? Basically, it is an overview of the design of any project that exceeds the general Norris Number of the rest of the team, given so that the team simply knows the structure of the project - regardless of whether or not they agree with the overall design. For example, if a team typically develops and maintains 2,000 line apps and needs to know about a 20,000 line app that's been developed, that's the time for a code overview. The goal of a code overview is to raise the Bus Factor - not to point out faults in the code or its design. The bus factor on most projects in smaller teams typically hangs around one. That is, if that one person gets hit by a bus and is killed/maimed, the project becomes a hopelessly lost cause from the business perspective of getting new features added to it. Simply knowing the structure of a project is sufficient for a reasonably capable programmer to carry on the work. After all, half the battle of updating a project that a programmer has never touched before is figuring out its structure.

How does one conduct a code overview? It is a presentation style meant to disseminate information. When you think "presentation", you probably think PowerPoint with notes made in Word or similar software, and that's exactly what a code overview is: one person shows the work that has been done and other people sit in and learn. It's a formal environment designed to guarantee dissemination of important information. In many cases, it only needs to be 10-15 minutes long because it is intended for a similarly technical audience of peers. The key is to remember that it isn't about putting on a show. What files sit where and what those files do, or a database schema with the most important tables highlighted, are just two topic ideas for a code overview. The presenter may not even open up those files, look at any code(!), or look at the database data - just knowing the structure is generally sufficient. Then make the content of the presentation available in a Word document (or equivalent - e.g. Google Doc). The goal is to get more people up to speed just enough to be able to go in and change the project should the need ever arise. Hopefully it won't, but it's nice to cover the bases.

As a side-benefit, code overviews happen rarely and mostly around the time a project hits the maintenance phase of development. This allows everyone to focus on their work and occasionally learn about other projects, which saves the organization time and money down the road. It also fosters healthy interactions within the team, and smart managers see it as an opportunity to leverage the budget to buy food and simultaneously accomplish other important organizational tasks during the meeting.

Code reviews generally breed a negative, unhealthy atmosphere: nearly everyone going into one has the wrong mindset, or gets sucked into the wrong mindset during the review, and it turns into a form of workplace bullying. As long as the code works and does what it is supposed to do, who cares what it looks like? Code overviews, on the other hand, are a harmless but essential part of creating continuity within an organization.

Saturday, September 27, 2014

The Encrypted File Storage System (EFSS) saved my butt

I've been testing a beta product called the Encrypted File Storage System (or EFSS) in a production environment for a while as an offsite backup solution. I'm only backing up 200MB of compressed data, but it works very well. Most backup solutions rely on pushing or pulling the data across a network. EFSS puts data to be backed up into an emulated file system locally and transparently encrypts and compresses the data. Each 4KB block of data has a timestamp associated with it, which makes incrementals over a network blazing fast - faster than anything else I've used.

Yesterday I was monkeying around on my server and removed a few installed packages that I no longer needed. Unfortunately, removing those packages corrupted a critical configuration file. I fired up the EFSS command-line shell, mounted everything except the last 24 hours of incrementals, and exported the configuration out to the file system. I did an eyeball diff, checked a few things on the file system, and restored the configuration back to what it had been. If I hadn't had an EFSS-based backup (e.g. had been using rsync), I would have had to rebuild the configuration from scratch, which could have taken several days instead of a few minutes.

You know your backup solution is working when you can rapidly recover from a data loss. EFSS also recently helped me to move a functional website from one server to another with permissions, owners, groups, and timestamps completely intact in a fraction of the time it would have taken me using other backup systems.