Wednesday, July 22, 2015

Solving "unresolved external symbol ___report_rangecheckfailure" Visual Studio linker errors

Let's say you import a library from Visual Studio 2012 or later into your project in an older version of Visual Studio (e.g. Visual Studio 2008 or Visual Studio 2010) but now get linker errors:

error LNK2019: unresolved external symbol ___report_rangecheckfailure referenced in function ...
error LNK2001: unresolved external symbol ___report_rangecheckfailure ...

Sad day. Especially since you don't really get a say in how that library is being built. Your options are:

  1. Upgrade your version of Visual Studio. That includes going through the whole project upgrade cycle. We know how well that usually goes.
  2. Recompile the library yourself. Sad day turns into sad week.
  3. Hack it.
The function __report_rangecheckfailure() is called when the /GS compiler option is used. The option enables buffer overflow security cookie checking, which, in this day and age, is a good option to have enabled. Unfortunately, that causes problems with older versions of Visual Studio. (The extra leading underscore in the linker error is just x86 __cdecl name decoration of __report_rangecheckfailure.) Let's take a look at what the function does - the source code from 'VC\crt\src\gs_report.c' has this code:

// Declare stub for rangecheckfailure, since these occur often enough that the code bloat
// of setting up the parameters hurts performance
__declspec(noreturn) void __cdecl __report_rangecheckfailure(void)
{
    __report_securityfailure(FAST_FAIL_RANGE_CHECK_FAILURE);
}
Hmm...not really helpful since it calls another function. However, that function contains this very interesting comment after a lot of inline assembler and macros:

    /*
     * Raise the security failure by passing it to the unhandled exception
     * filter and then terminate the process.
     */
So, knowing this normally triggers an unhandled exception and exits the process, we can hack it:

// Stub replacement: terminate like the CRT version, minus the reporting.
// Needs <windows.h>; wrap in extern "C" if compiled as C++.
#include <windows.h>
__declspec(noreturn) void __cdecl __report_rangecheckfailure(void)
{
    ::ExitProcess(1);
}
I'm not sure whether to congratulate myself on this evil hack or cry. I think I'll do a little of both. You're welcome.

Oh, and if you work on the Microsoft Visual Studio development team, please develop a compatibility library that implements this stuff correctly for older Visual Studio environments. Doesn't have to go back to Visual Studio 6 from 1998, but something reasonable like a 10 year window that addresses issues like these.

Monday, July 06, 2015

The Death Master File...a blackhat's dream come true

First, watch this CBS 60 Minutes special on the Social Security Administration's Death Master File:

[Embedded video: CBS 60 Minutes report on the Death Master File]
The ultimate hack, from a blackhat/rogue government perspective, is the one that has significant negative impact on the financial stability of a country and...no one can figure out who is responsible.

The Death Master File meets all of the prerequisite criteria:

  • Large quantities of data? Check.
  • Has significant financial consequences for anyone who gets into it? Check.
  • Individuals can't readily find out if they are on the file or not? Check.
  • Relatively easy to add anyone to the file? Probably check (e.g. plop some malware on funeral home computers and get remote access to adding entries to the file).
  • Takes years to get off the file? Check.
  • Has recurrent consequences for the rest of the individual's life? Check.
  • No way to track additions back to the original source? Check.
  • The head honcho at the Social Security Administration doesn't really care about "accidental" additions and only seems to care about paying out too much money? Check.

You really couldn't ask for a more perfect combination. It's pretty shocking when you think about it - zero safeguards, no one seems to care, and it has major repercussions for affected individuals (e.g. homelessness). Destroying the U.S. is quite literally available on an unprotected digital silver platter. There are so many different ways that this could go sideways that I'm not really sure where to start, other than to write a blog post about it to raise awareness.

As a software developer, the one thing that REALLY irks me is this:

https://dmf.ntis.gov/

There is a $200 annual subscription fee to access the data, and access is restricted to government entities and businesses with a need for the data. Individuals can't write a script to watch for the unfortunate event of being added to the list. The list is supposedly a very lucrative source of income for NTIS, which means that every business out there seems to use it. Sorry, but my tax dollars aren't for NTIS to run an e-commerce store. Data for all or data for none.

The U.S. government is ill-equipped to handle modern threats - writing laws and charging money for access to the data doesn't close blatant security holes. Who was the person who decided to not bother with change tracking in this rather critical database? That's ridiculous and they should be fired and drop-kicked out the door. Also, to simply not care about those people whose lives the Social Security Administration has messed up is rather messed up too. The U.S. has enemies who would love nothing more than to destroy the country. Adding people to the Death Master File seems like a pretty easy way to accomplish such a task.

Saturday, June 20, 2015

How to call select() - the CORRECT way!

There is a TON of broken code out on the Internet, written by programmers who enter the world of TCP/IP socket development and think they have figured out how to write socket code. They then disseminate their broken code to others who, in turn, disseminate it to still more people.

One of the most egregious problems plaguing the world of software development today is the use, or abuse, of select(). Today, you are going to contribute to fixing this problem once and for all by reading what I have to say and then ingraining it into your brainz.

There are two types of file descriptors/sockets/what-have-you:

Blocking and non-blocking. Sometimes referred to as synchronous and asynchronous.

If you are using select() on synchronous sockets in your code, you are doing it wrong!

select() is ONLY for asynchronous sockets. Think of it this way: A synchronous socket is you telling the OS that you know the exact order of operations on that socket (e.g. POP3) and are willing to wait until hell freezes over for that read/write operation to complete.

Read that over again and you should come to the same conclusion: Calling select() on a synchronous socket is WRONG. Although, if you've been doing it wrong for decades, this fact becomes a lot harder to accept.

Where does this misunderstanding come from? A lot of people misunderstand select() because the book, teacher, or website they learned *NIX socket programming from got it wrong, having learned the wrong approach from someone else in turn. select() on a synchronous socket introduces bugs that are hard to trace and happen randomly. Also, most socket programmers start out using synchronous sockets with simple client-side applications and later want a way to handle multiple sockets at one time. select() is referenced all over the manpages/MSDN and, when the programmer reads about it, it sounds like it will work, so they try it and it seems to work. That's the real problem: select() seems to work, which is why programmers use it improperly.

select()'s purpose in life is to offer a way to not have a busy loop in an asynchronous environment since no read/write operation will ever block. It is entirely possible, if you pass in a synchronous descriptor to select(), that select() will indicate that the socket is readable but when you go to read data, the synchronous socket will block. You might say that can't possibly happen but guess again...it does happen! (The Linux select(2) man page even warns about this: a socket can be reported readable and then have the data discarded, such as on a bad checksum, leaving a later read to block.) This is why select() being only for asynchronous sockets makes much more sense. Once you learn this, the code for asynchronous sockets becomes surprisingly cleaner and is only marginally more complex than synchronous socket code. If you ever thought your synchronous socket code using select() was kind of hacky/messy, then you now know why. This is a harsh lesson to learn for many people.

Therefore, to process more than one descriptor per thread at one time, use asynchronous descriptors and select().

The best way to fix this entire problem would be for select() to determine if a descriptor is synchronous and then reject it outright. Such a solution would break every major application and then, lo-and-behold, everyone would fix their code! The world would then have less broken code and we'd all be happier.

Friday, June 19, 2015

Dear WebSocket, 1980 called and wants its text mode back among other things

This is a mostly tongue-in-cheek response to RFC 6455, which defines the WebSocket protocol. I recently built a client for it, which can be found in the Ultimate Web Scraper Toolkit. Certain things annoyed me.

Dear WebSocket,

FTP called (RFC 765, circa 1980) and wants its text mode back. Please return it to the nearest Internet Engineering Task Force (IETF) member as soon as possible. You may have shined it up a bit with UTF-8, which was basically designed on a napkin. Of course, Unicode has never had any implementation problems with it whatsoever. Ever.

Your masking key is for clients only and not even being optional for servers defies the core Internet tenet of being liberal with what you accept, strict with what you send. Technically, both are peers and therefore both are clients of each other since you are, after all, bi-directional and the client could easily function as a server after the connection is established. Because this has absolutely never ever been done before.

Your comments about the Origin header only being sent by web browsers suggest it can differentiate an automated script from a web browser. Or can it? No, it cannot. For I can simply send to the target whatever Origin header I want and therefore look like any web browser of my choosing. Your false defenses are no match for my Kung Fu.

Your reserved bits per frame are silly and probably full of arbitrary limitations. From the comfort of my keyboard and screen, I fart in your general direction.

Your required closing packets are unnecessary and most likely a security risk with anyone assuming that such frames will ever be sent. TCP already has built-in shutdown mechanisms. Ping/pong frames are more than sufficient.

Since it is important to always remember the childrens, you get an F--. Enjoy.


Insincerely,

The portion of the Internet that actually knows how to design rational communication protocols.

Tuesday, May 12, 2015

SSL verification does NOT prevent MITM server-to-server attacks

Man-in-the-middle, or MITM, is a specific attack whereby an attacker injects themselves into the communication stream between the sender of a message and its recipient. The most common example on the Internet is between a web browser and a web server. I am not disputing the necessity of verification in that example: last-hop MITM defense is an essential component of SSL security, even though rogue but legitimate-looking certs (e.g. those generated and issued to law enforcement) are nearly impossible to detect. Server-to-server MITM defense, however, is far more dubious.

Let's suppose I am an attacker for a moment and I find a way to inject myself between two servers in your web application (e.g. web server and database server). What is my behavior? Dumb attackers will simply take the low-brow approach and try to access the communication stream continually, which seems to be the use-case that's bandied about for SSL with verification in the server world. However, ask any business owner which is more important to them: application security or application uptime? Given the choice, the business owner chooses uptime over security every single time.

Therefore, as an attacker, I simply let traffic through and only intercept one out of every 1,000-ish requests, or for a random 5-10 minutes outside of regular business hours and 20 minutes on weekends. I also use a bogus certificate because my goal isn't to intercept the traffic but rather to randomly take down the application over and over and over again for short periods of time. IT and web developers believe that turning on SSL with verification offers better security, so this strategy actually works against them in a MITM scenario because that security reduces application uptime.

The first approach (1 out of 1,000 requests) is likely to NEVER be discovered but will really irritate everyone, especially IT and the web development team. The latter is more obvious but infuriating because it wakes multiple people in IT and on the web development team anywhere from 10 p.m. to 4 a.m. weekly and is always followed shortly by the horrifying words, "never mind, it started working again." Assuming anyone ever finds and fixes the MITM attack, the result is only a bittersweet victory because you can't punch the attacker in the face. I, the attacker, have won by driving you insane and depriving you of much-needed sleep. After all, IT is the modern insane asylum.

SSL verification does NOT defend against the intelligent MITM attack in a server-to-server environment. There is also no real reason to turn it on in the first place. See, most servers sit in a data center. Data centers are generally only one or two hops away from a BGP router (the core Internet routers). The distance between servers is typically 4 to 6 hops (at most) and the amount of traffic that has to be processed at that level is pretty mind-blowing. To conduct a MITM attack, an attacker would have to attack a single application, which is a pointless example, or break the whole data center, at which point your paltry web app is not the only thing that's been violated. Or break a BGP router, which the NSA apparently did when Egypt's Internet access went out a couple of years ago. Supposedly that was a bad firmware modification made by the NSA that hosed the BGP router and took out an entire country's Internet access. Oops. My heroes. (#sarcasm) So the only logical conclusion is that nothing anyone does will ever stop a MITM attack at the server-to-server level and an attacker will more likely break the entire data center the server sits in than an individual application. Because the larger target is far more tempting - as evidenced even by the NSA.

To add one final nail to this coffin, SSL with verification in server-to-server communications also causes problems in setting it up and keeping it up. First you have to construct a PEM certificate chain. Then you have to get the software to actually use the certificate chain, which can be a challenge in its own right. Then you have to reconstruct the PEM certificate chain because you found the wrong certs or they are in the wrong order or you forgot one because no one documents anything. And, when they do document things, the documentation isn't kept up to date. Then the application finally starts working after wasting four hours of your life. Then around 2038, the application suddenly stops working because one of the certs expired and you have to remember how to do it all over again. Is your web app still going to be around that long? You hope not today but...surprise, surprise! Some applications stick around that long. Remember Y2K? Yeah, well, tossing in an expiration date into an application is totally going to come back to haunt you when you are 60+ and have early-onset dementia or Alzheimer's. But you defended against MITM attacks. Or did you? You can't remember and don't actually know...so why actually care until someone makes this extremely simple and easy to do in a way that makes sense?
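For what it's worth, the ordering most software expects in that chain file is leaf-first (the file name and placeholder contents here are made up):

```text
# server-chain.pem - order matters: your own cert first, then each
# intermediate, ending nearest the root. The root itself is usually
# omitted since the verifying side already has it in its trust store.
-----BEGIN CERTIFICATE-----
... your server (leaf) certificate ...
-----END CERTIFICATE-----
-----BEGIN CERTIFICATE-----
... intermediate CA certificate ...
-----END CERTIFICATE-----
```

Get the order wrong and some software silently copes while other software fails verification, which is exactly the kind of inconsistency that burns those four hours.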

In a MITM attack scenario: You and your personal sanity, not your web application, are already hosed. So you can turn off SSL with verification now in your web applications and SDKs - it has less maintenance overhead and may even help long-term application uptime and therefore your personal sanity remains intact. In the case of an SDK, you can at least make it an option to turn off verification instead of turning off SSL altogether. Finally, everything surrounding SSL certs needs to be made easier. Way easier.

Friday, May 01, 2015

Portable Apps on a very fresh Windows installation is a bit buggy

I recently reinstalled Windows. Portable Apps shaved a ton of time off the reinstallation process. It is awesome to return to the DOS days of computing on the modern desktop, where each application is self-contained, as all applications should be.

However, during my reinstall, running the Portable Apps updater at first resulted in the message, "The downloaded copy of ... is not valid and can not be installed. This could be due to an incomplete download or other network issue. Please try running the updater again when complete." Re-running the updater resulted in the same message. There's something about repeating the same thing again and sanity that could be said here.

I eventually resolved the problem after I realized that I hadn't run Internet Explorer before. So I ran IE for the first time, got through all of the dialog boxes and then shut IE down. After that, the Portable Apps updater worked great. I've always half-figured that IWebBrowser2 was behind the scenes of the updater, which is unfortunately susceptible to IE's weird initialization quirks. Not really a Portable Apps problem but the error message could be adjusted a little bit.

Monday, April 13, 2015

Corel PaintShop Pro X7...still not worth using

I am in the process of finishing up a reinstall of Windows on shiny new hardware. During my adventure, one of the pieces of software I've been needing to upgrade or replace for some time has been Photoshop. I've got CS3 Extended (picked it up for about $300 as an upgrade from 6.0) and, while it serves its purpose, the bugs on newer versions of Windows are incredibly frustrating (even for someone like me who rarely uses it). Adobe Creative Cloud is a non-starter for me for a wide variety of reasons that have already been beaten to death elsewhere on the Internets. I fire up Photoshop about once every 3-4 months, so $120/year is pretty absurd. $1,500+ for CS6 is completely bonkers. Photoshop Elements also doesn't have the feature set I need.

Here is what I depend on in order of most-used:

  1. The color picker in Photoshop. The active response while dragging sliders around and the ability to select specific hue, saturation, and brightness values. The ability to copy and paste hex codes as well. I also need the full dialog (especially the alerts), not the cut-down version of Elements (which lacks alerts).
  2. The vector toolset.
  3. The text toolset.
  4. The style box. Mostly for drop shadows.
  5. Layers.
  6. Layer groups.
  7. Polygon selection tool.
  8. Unlimited undo/redo.
Everything else is almost useless to me. I realize my needs are in the minority, which is why my #2 need is going to have most people who use Photoshop be all like, "Wait? People actually use that? Why not Illustrator? Or Inkscape?" It's the blend of vector and raster that is needed for web development (e.g. icons) and Photoshop still does that best (IMO). I operate on a fixed-size canvas of these things called pixels. Maybe you've heard of them and this thing called the web before (and possibly print). Yeah, welcome to 2015.

But enough about Photoshop. I was looking for "alternatives to Photoshop" and an article came up with a list of alternates. The typical #1 on the list was GIMP [sigh] - the ONLY graphics editing software that still lives up to its name. (It's 2015 and GIMP still doesn't have real vector layer support?) Now that that's out of the way, a few other non-starters showed up as well. Then PaintShop Pro showed up near the end of the list and I went, "Wait...why does that name sound familiar?" I searched for it and then had a nostalgic flashback from the past. I had used PaintShop Pro 3 WAAAAY back in the day when I couldn't afford much of anything, so I just ran the non-expiring shareware version forever. I remember reading that Corel picked them up and then I never heard about the product again.

Until two days ago. PaintShop Pro is apparently "frequently" compared to Photoshop Elements. Useful information on the product is scant (at best) but, for some reason, it is covered by PC Magazine - one of the most coveted magazines that software publishers want to get into, which also somehow still exists. Unfortunately, the current version of the product (X7 - aka version 14) is severely lacking in the core tools I use and is clearly targeted at the consumer market while still missing the mark - maybe find a different target audience? Like me. Target me. Corel, stick me on your beta testing program.

  1. The color picker is woefully useless and inconsistent everywhere for my needs. There's just something basically perfect about the Photoshop color picker. Select the hue first, then saturation and brightness. Get the right color and intensity in a fraction of the time. Now, PaintShop Pro X7 does have some really interesting things going for their picker such as the complementary selection interface, but the online Adobe color wheel is just as good, if not better, and is free. I am actually a bit surprised to see that fairly advanced feature in PaintShop Pro because it targets people who understand color combinations...a topic that only a very specific group of people understand and had to specifically be trained on. Which makes the next failure utterly confounding.
  2. The vector toolset is broken in a critical way. I use the pen/point vector tool in Photoshop, which is essentially a mashup of PaintShop Pro's freeform and bezier curve tool. (The GIMP pathing tool does it right except there is no way to make a vector layer in GIMP, which makes it a completely useless piece of software.) With Photoshop, I make polygons that sometimes have bezier curves in them. PaintShop Pro's bezier curve tool terminates drawing once a point is placed after a curve is placed - it's really weird and I really did try for several hours to make it work but it is broken. Photoshop and GIMP allow "mixing" the two operations together. The PaintShop Pro vector toolset is not terribly responsive to common keyboard usage either. At this point, my hopes were crushed in finding a reasonable, up-to-date, affordable piece of software (anything under $100 is pretty affordable). On the plus side, text is able to follow an existing vector path, similar to later versions of Photoshop that I don't have access to. So there's that going for it.
  3. Photoshop seems to have better font rendering and adjustment options than PaintShop Pro. Selecting a font in PaintShop Pro X7 is about on par with Photoshop CS3. That said, selecting a single font out of hundreds of fonts for a specific purpose is a time consuming and annoying task that no tool, to date, solves with ANY elegance. And with THAT said, TrueType Fonts (TTF) are a disaster because there aren't any actual Standards for creating fonts that I can see. When I say size "20" in a program like PaintShop Pro or Photoshop, I expect all fonts to fit in the same box on the screen and follow the same rules as all other fonts. Whatever 20 actually means, no one agrees on it, so the end result is that every program renders the same font differently and font authors seem to get to choose what it means to them. And don't get me started on spacing - spacing between letters is so random in most fonts, it makes me wonder if font authors actually test the fonts they make. This is stupid. We should scrap the clearly broken TTF format and start over.
  4. PaintShop Pro's drop shadow dialog leaves a LOT to be desired. Big, heavy, ugly drop shadows. The dialog itself looks like they bolted it onto the side of the product to copy Photoshop back in the late 1990's and never touched it again. Also, the layer styles are buried under a tab. The other styles are limited and need work too.
  5. PaintShop Pro's layers aren't actually that bad. I'm not sure why they have so many types of layers. Maybe for performance reasons? I would only end up using vector and some raster layers, so I guess "hooray" for the extra layer types. I really like the option to expand vector layers into their individual vector drawings. That's a nice-to-have feature but not absolutely necessary. The thumbnail images seem to be nearest neighbor instead of bicubic scaled.
  6. For some reason I was unable to create a layer group until a few seconds before writing this because the option was previously grayed out (not sure why that was the case). Now that I see them, I'm on the fence because layer groups look just like a regular layer and there could be confusion - but I suspect it would just take time to get used to. Dragging a layer onto a layer group to add it to the group is a bit finicky. However, Photoshop groups are equally finicky. No one, to date, seems to have figured out drag-n-drop support for layer groups which seems odd because groups are very useful and drag-n-drop, from a programming perspective, has existed since Windows 95.
  7. There is no raster polygon selection tool in PaintShop Pro X7. Uh...okay. That makes it much more difficult for me to cut a unicorn out of a picture and slap it on a different background while it holds a heart-shaped spatula in front of a BBQ grill which I, in turn, print onto a custom Home Depot gift card to give to someone. (Yes, that's a thing. That I do. Don't judge it until you've tried it and seen the expression on the recipient's face. Classic.) Freeform and automated selections are "nice" but polygon selection is MUCH better as it is significantly more refined/accurate while, at the same time, requires fewer mouse actions and thus saves my wrist and eyes from cutout tedium.
  8. I didn't test how much undo/redo there is in PaintShop Pro, but thank goodness multiple Ctrl+Z presses means a real undo operation. This is the only area where PaintShop Pro beats Photoshop, which requires doing the finger dance of Ctrl+Alt+Z (aka absolute bologna) to do undo to multiple levels. In Photoshop, Ctrl+Z does a single undo and then another Ctrl+Z undoes the undo - dude, that's what Ctrl+Y is for. The only thing better would be fully rebindable keys so that people could map their keyboard to do whatever is most convenient for them.
The one thing I really love in PaintShop Pro is how files are displayed as a series of tabs instead of the archaic MDI method of display (multiple windows in a window, which is what Photoshop does). I'm a huge fan of tabbed editing in my text editors, so to me it makes sense for image editing too. IMO, Crimson Editor does tabbed editing correctly (among countless other things that it does right). You can keep IntelliJ, Notepad++, Eclipse, Sublime, Emerald Editor (Crimson's failed successor), and the rest of those disasters of text editors. MDI interfaces like those in Photoshop serve little to no purpose. On the tabbed UI front, PaintShop Pro beats Photoshop hands-down. If you need to see two images side-by-side, it is easy to turn off the tabbed interface (Windows -> Tabbed checkbox). Maybe a newer version of Photoshop does the same thing, but the Elements 13 screenshots and videos I've seen don't indicate that anything has changed there.

Anyway...as I said earlier, PaintShop Pro X7 doesn't have a sufficient vector toolset to do serious work in. Their target audience seems to be the consumer crowd. That's a shame because the (rather large) target audience that isn't being catered to by either Corel or Adobe is the web designer. I make do in Photoshop, but with the vector toolset near the bottom of the list of tools and the way it feels when using it, it is clear that Adobe doesn't care about it. Same with PaintShop Pro X7. Vectors seem to take an unfortunate backseat in raster graphics packages. Partly because people don't understand what to do with them (education issue) and partly because the tools are rather weak to manipulate them in what are largely raster programs (a problem which feeds back into the education issue). When you marry vectors with powerful post-processed raster styles (drop shadows, borders, etc.), you get something rather unique: Mostly scalable raster objects that generally look really good at any size. That sort of thing resonates with the web designer and application developer in me.