Saturday, January 17, 2015

Quick-n-dirty Postfix and Dovecot setup for an internal notification server

Two of my least-favorite open source products are Postfix and anything that interfaces with it. I use Postfix and Dovecot extensively and still find new annoyances with both of them. My major gripe is that Ubuntu lacks a simple package that asks a few questions and then automatically configures a working mail server based on the answers.

The objective of this post is to show a simple setup of Postfix and Dovecot for Ubuntu 14.04 LTS to enable command-line scripts on a box on a local network to send e-mail to itself and then retrieve those e-mails over POP3 using a normal e-mail client. This is NOT a full-blown e-mail server setup. It is a notification message drop with POP3 access. The e-mail client will only be able to retrieve e-mails from the box (not send them), which completely eliminates the possibility of the box accidentally turning into a remote mail relay.

First, install Postfix and Dovecot:

sudo apt-get install postfix
sudo apt-get install dovecot-pop3d
(I don't recommend the 'mail-stack-delivery' package. It includes IMAP support, which I'm not a fan of for something so basic. Only install the packages you need. I installed 'mail-stack-delivery', decided it wasn't the right choice, and then spent longer than I should have cleaning up the garbage it left behind.)

Next, set up a user account for handling the e-mails:

sudo adduser cronmail
Name the account to be whatever you want.

Edit '/etc/postfix/main.cf'. Change these options:

inet_interfaces = 127.0.0.1
mailbox_command = /usr/lib/dovecot/dovecot-lda -f "$SENDER" -a "$RECIPIENT"
That isolates SMTP to localhost-only access and routes incoming messages to Dovecot. The 'mailbox_command' line is important - without it, messages get routed to the wrong place and major hair-pulling ensues.
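To double-check that Postfix actually picked up the changes, 'postconf' can print the effective values (this assumes a stock Ubuntu Postfix install with 'postconf' on the PATH):

```shell
# Print the live values of the two settings changed above.
# 'postconf -n' would instead list every setting that differs from the default.
postconf inet_interfaces mailbox_command
```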

Next, verify the Dovecot configuration with 'dovecot -n'. If anything is off, visit '/etc/dovecot/conf.d/' and use grep to find the offending bits.
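For example, if 'dovecot -n' shows an unexpected 'mail_location', a recursive grep points at the exact file to edit ('mail_location' here is just an example option name - substitute whatever looks off):

```shell
# Locate which conf.d file sets a given option.
grep -rn "mail_location" /etc/dovecot/conf.d/
```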

Restart Postfix and Dovecot:

sudo postfix reload
sudo restart dovecot
Now verify that only the ports you want open are open and that SMTP is only available on 127.0.0.1:

sudo netstat -anpt
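Since 'netstat -anpt' dumps every socket, a small awk filter cuts the output down to just the listening sockets. On this setup, the expectation (an assumption about a correctly configured box, not guaranteed output) is SMTP bound only to 127.0.0.1:25 plus Dovecot's POP3 listener on port 110:

```shell
# Column 6 of 'netstat -anpt' output is the socket state; keep only
# LISTEN entries and print the local address and owning process.
sudo netstat -anpt | awk '$6 == "LISTEN" {print $4, $7}'
```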
Once you are satisfied, configure your e-mail client with the new user account (cronmail@yourservername).

Finally, send an e-mail to the new e-mail account. If you use the Ultimate E-mail Toolkit, it is important to remember to disable the DNS lookup via 'usedns':

$headers = SMTP::GetUserAgent("Thunderbird");

$options = array(
 "headers" => $headers,
 "textmessage" => $message . "\n\nSent by [yourservername]",
 "server" => "localhost",
 "secure" => false,
 "usedns" => false,
 "username" => "cronmail",
 "password" => "*******"
);

SMTP::SendEmail("cronmail@yourservername", "cronmail@yourservername", "[yourservername] Some Notification", $options);
And that's it! Hopefully this saves someone a few hours. Remember: it isn't a full-blown mail server - just a notification drop with POP3 access.

Wednesday, January 14, 2015

Do code overviews NOT code reviews!

Programmers dread the "code review". This is where the programmer sits down with his or her peers, and those peers bash their fellow programmer's code - and, in not so many words, tell the programmer that they are a terrible person. The code review is ego boosting/ego crushing disguised as a quality assurance practice. At least, that's what happens in a lot of code reviews, and when it happens, it is a form of bullying. The word "review" implies that the code is being judged:

Review: "a formal assessment or examination of something with the possibility or intention of instituting change if necessary."

Judged: "form an opinion or conclusion about" or "give a verdict on someone."

Also, the code review involves not only judgment - the peers act as jury and executioner in some sort of twisted intervention:

Intervention: "an occasion on which a person with an addiction or other behavioral problem is confronted by a group of friends or family members in an attempt to persuade them to address the issue."

Code reviews are the equivalent of walking into the principal's office in school and getting flogged by a student. Everyone dreads the idea, so why do them? Code reviews are actually from the 1970's (no joke!), which means we need to seriously question their relevance. They are outdated and have no place in the modern workforce. Instead, we need to be doing the more modern "code overview".

What is a code overview? Basically, it is an overview of the design of any project that exceeds the general Norris Number of the rest of the team, given so that the team simply knows the structure of the project - regardless of whether or not they agree with the overall design. For example, if a team typically develops and maintains 2,000-line apps and needs to know about a 20,000-line app that has been developed, that's the time for a code overview. The goal of a code overview is to raise the Bus Factor - not to point out faults in the code or its design. On smaller teams, the bus factor of most projects hangs around one. That is, if that one person gets hit by a bus and is killed/maimed, the project becomes a hopelessly lost cause from the business perspective of getting new features added to it. Simply knowing the structure of a project is sufficient for a reasonably capable programmer to carry on the work. After all, half the battle of updating a project a programmer has never touched before is figuring out its structure.

How does one conduct a code overview? Well, it is more of a presentation style to disseminate information. When you think "presentation", you probably think PowerPoint with notes made in Word or similar software, and that's exactly what a code overview is. One person shows the work that has been done and other people sit in and learn. It's a formal environment designed to guarantee dissemination of important information. In many cases, it only needs to be 10-15 minutes long because it is intended for a similarly technical audience of peers. The key is to remember that it isn't about putting on a show. What files sit where and what those files do, or a database schema with the most important tables highlighted, are just two topic ideas for a code overview. The presenter may not even necessarily open up those files, look at any code(!), or look at the database data. Just knowing the structure is generally sufficient. Then make the content of the presentation available in a Word document (or equivalent - e.g. a Google Doc). The goal is to get more people up to speed just enough to be able to go in and change a project should the need ever arise. Hopefully it won't, but it's nice to cover the bases.

As a side-benefit, code overviews happen rarely, mostly once a project hits the maintenance phase of development. This allows everyone to focus on their work and occasionally learn about other projects, which saves the organization time and money down the road. It also fosters healthy interactions within the team, and smart managers see it as an opportunity to leverage the budget to buy food and simultaneously accomplish other important organizational tasks during the meeting.

Code reviews generally breed a negative, unhealthy atmosphere. Nearly everyone going into one has the wrong mindset - or gets sucked into the wrong mindset during the review - and it turns into a form of workplace bullying. As long as the code works and does what it is supposed to do, who cares what it looks like? Code overviews, on the other hand, are a harmless but essential part of creating continuity within an organization.

Saturday, September 27, 2014

The Encrypted File Storage System (EFSS) saved my butt

I've been testing a beta product called the Encrypted File Storage System (or EFSS) in a production environment for a while as an offsite backup solution. I'm only backing up 200MB of compressed data, but it works very well. Most backup solutions rely on pushing or pulling the data across a network. EFSS puts data to be backed up into an emulated file system locally and transparently encrypts and compresses the data. Each 4KB block of data has a timestamp associated with it, which makes incrementals over a network blazing fast - faster than anything else I've used.

Yesterday I was monkeying around on my server and removed a few installed packages that I didn't need any more. Unfortunately, removing those packages caused a critical configuration file to become corrupted. I then fired up the EFSS command-line shell and mounted everything except the last 24 hours of incrementals and exported the configuration out to the file system. I did an eyeball diff, checked a few things on the file system, and restored the configuration back to what it had been. If I didn't have an EFSS-based backup (e.g. had been using rsync), I would have had to rebuild the configuration from scratch and that could have taken several days instead of a few minutes.

You know your backup solution is working when you can rapidly recover from a data loss. EFSS also recently helped me to move a functional website from one server to another with permissions, owners, groups, and timestamps completely intact in a fraction of the time it would have taken me using other backup systems.

Thursday, July 31, 2014

An adventure in writing a PECL extension for PHP

Okay, this isn't so much a guide on how to write a PECL extension as it is a discussion of my recent experience writing a PHP extension, publishing it, and documenting it. My hope is that, by reading through the struggles I went through, others can benefit.

Writing an extension for PHP requires serious skills and patience. Extension writing is a complex macro dance and really requires good planning to pull off successfully. In my case, I wanted to introduce native named synchronization objects into PHP. This is something I've felt has been missing from the language for far too long and I didn't see anyone else doing work in the area. The first thing I did was write a cross-platform library in C++:

Cross-platform C++ library

Writing a C/C++ library as a proof-of-concept is a good start to writing a PHP extension. I highly recommend it. Developing a separate library allowed me to work out most of the kinks in the logic apart from the Zend engine. After I solidified the working model, I began work on the extension itself. I had to port the library from C++ to C, but that was a fairly trivial operation.

The other major thing I did in advance of writing the extension was occasionally dig deep into the PHP source tree to figure out how functions actually worked behind the scenes. Some functions are so hopelessly complex (e.g. file handling because of URL wrapper support) that they are simply too dense to understand. Other functions, however, make for great snippets to commit to memory. The prerequisite time with the PHP source code before writing an extension is, IMO, about 3 to 4 months of casual interaction. That is just long enough to feel comfortable navigating the PHP source tree. If there was one thing I learned here, it is that every function in PHP is part of an extension, which may come as a surprise to many people.

The frustrating thing about PHP extension writing is that there is almost no documentation on how to go about doing it. There's a book (dead tree edition) by Sara Golemon on the topic and little else beyond a few minor, slightly dated blog posts. I ended up doing what most extension writers do - scouring the source code of other extensions to cannibalize specific ideas to write my own. "Simple" things in normal C such as returning a value become a Zend macro in PHP. Knowing which macro to use and what all the crazy options do is the hard part. Since I was making a set of object-oriented classes in PHP instead of functions, the amount of documentation on the topic approaches zero very quickly. So scouring other code based on the public documentation on php.net helps to figure out which macro is probably the right one. Having a good understanding of the source tree structure helped go a long way to figuring things out on my own. Ultimately, extension writing for PHP is a fairly dark art, which may explain why there aren't a ton of extensions out there. I got the distinct impression that the developers like it that way to require a minimum level of software development competence before work on an extension may begin.

That said, the PHP documentation on extension writing does a decent job of getting developers started. The 'ext_skel' script makes a mostly working skeleton for a new extension. Not bad. I think the main issue I ran into regularly was that ./buildconf has to be run with the --force option until the configuration file is finalized when using the release builds of the PHP source tree.
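For reference, the in-tree Linux build cycle looked roughly like this (the '--enable-sync' option name comes from this extension's config.m4; treat the whole sequence as a sketch rather than a recipe):

```shell
# Inside the PHP source tree, with the extension skeleton under ext/sync/:
./buildconf --force        # regenerate ./configure; --force is needed on release tarballs
./configure --enable-sync  # enable the new extension
make                       # build PHP with the extension compiled in
```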

I highly recommend developing an extension on Linux first and then porting it to Windows after that. The compiling environment on Linux for PHP is far superior to the Windows build environment. However, if you are like me and prefer Windows text editors and IDEs (i.e. don't like Linux editing tools), do what I did: use WinSCP as the go-between and edit the source code in your favorite Windows-based text editor. That worked out pretty well for me. And since I had worked out all of the core issues with my extension in a separate library (which allowed me to fire up Visual Studio for real debugging), nearly all of the extension writing process was porting the code (easy) and writing plumbing to connect it to Zend (more difficult).

I also developed a small test suite to validate that the code was working as expected. I highly recommend making a small test suite as it helps catch bugs in the code.
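The PHP source tree also has a conventional mechanism for this: .phpt test files under the extension's tests/ directory, run via 'make test'. My suite started out separate, but the .phpt route is what the PHP build system expects (the path below is illustrative):

```shell
# Run only this extension's .phpt tests from the PHP source tree root.
make test TESTS=ext/sync/tests
```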

Once the Linux version was done, I went and tested it on Windows. The Windows build system, as mentioned earlier, is more fragile. One wrong character in the wrong place and the whole Configure.js script will blow up without specifying why. Setting up the Windows build environment is also a bit more difficult as there are several distinct pieces that have to be in the right place. However, once everything was in place, it built just fine. Again, a small test suite can come in very handy for tracking down bugs.

At this point, the extension was developed but my adventure was only beginning. See, I wanted it to be a PECL extension. If I were simply satisfied with just having a PHP extension that anyone could compile into PHP, that would be the end of it. However, PECL sprinkles on some special magic that transforms an extension into something that people want to use because an extension is suddenly easy to install via "pecl install extensionname" and then PECL handles downloading, extracting, compiling, and installing the extension. Also, package maintainers for major OSes like Ubuntu will pick up the extension and make it easy to install with package management tools like 'apt-get'. There are huge visibility advantages to deploying an extension via PECL.
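For the end user, that "special magic" reduces to a couple of commands. The php.ini steps below are my own addition and assume Ubuntu 14.04's PHP 5.5 layout - PECL builds and installs the .so but does not enable it:

```shell
sudo pecl install sync                                        # download, build, install
echo "extension=sync.so" | sudo tee /etc/php5/mods-available/sync.ini
sudo php5enmod sync                                           # enable it Ubuntu-style
```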

Releasing on PECL requires approval. The PHP devs are rather stringent about who they let in, so I knew I needed to make the new extension relatively awesome. I got on IRC and the PECL dev mailing list and started several discussions. This is a very important step as there are conformity issues that will crop up. It is important to be super-flexible and willing to make changes to the code. I ended up moving a GitHub repo around and revamping a lot of code during this process. Once everyone seemed to be cool with the work that had been done, it was time to apply for access to PECL, PHP, and documentation repos via the PECL signup form. It's a completely laid-back process - therefore, after applying, I recommend just finding another project to work on. It can take up to two months for the devs to get around to accepting new users and getting those users set up with the appropriate level of access.

The hardest part about developing a PECL package is figuring out what the "correct" way to develop an extension is. If writing a regular extension is somewhat of an obscure task, the process of releasing a PECL package is more so. This isn't really anyone's fault since there isn't a whole lot of need for new general-purpose extensions for the language to begin with. So hammering out a good guide is a bit of a low-priority when there are other, more pressing matters to attend to. Plus, I'd wager that it raises the bar to entry somewhat significantly. PHP is software used on millions of hosts, so it needs to have some semblance of quality control applied to it. Obscuring the process of writing and releasing an extension is a pretty good solution to that problem.

At any rate, once the PECL account is approved, a whirlwind of activity happens. I had already generated a PECL package, but I had to regenerate it after rewriting parts of my 'package.xml' file. I had also looked at what doing documentation would require. But by the time the approval actually happened, I had kind of forgotten what I had done, so I made sure the test suite still passed and the extension still built as a sanity check. In all, it only took about one weekend to do the actual release and documentation cycle. Again, looking at how other extensions do things helps a lot with creating a consistent experience.

Here's the PECL package:

http://pecl.php.net/package/sync

Here's the documentation:

http://us1.php.net/manual/en/book.sync.php

The original version 1.0.0 of the extension had a bug that only showed up on some hosts that the PHP dev team caught. I tracked it down and fixed it and sheepishly released the 1.0.1 version of the extension. Interestingly, the 1.0.1 version received an "automatic" Windows DLL. I didn't build it but I suspect the team was waiting for a fix for the bug before letting the system do the Windows build. Also, within five minutes of 1.0.1 being released there were 40 downloads according to the PECL stats page. I assume there are automated processes sitting on the announcements list looking for new package uploads to PECL - either that or crazy people.

The documentation writing bit was a different experience too. Good documentation includes code examples that cover real-world scenarios (i.e. not contrived). The PHP documentation is written in a giant set of XML files. The primary way to introduce documentation is via Subversion. The main way to adjust the documentation is via a custom web-based GUI that the team has come up with. At the time of this writing, there is a site that publishes the latest documentation every six hours and the main PHP website and mirrors are updated every Friday. The GUI is pretty neat in that it attempts to manage the translation teams and tracks which bits of documentation are no longer building (because XML is pretty fragile). Because I was introducing new documentation into the tree, it was far more efficient for me to use the Subversion route. I made sure that my changes built locally without issues before committing them back into the main repository. Because so many people are involved in PHP development, it is very important to tread carefully when committing anything (code or documentation) and try to avoid breakages.

In my opinion, the source code is simple enough to use as a tutorial extension that does something useful without being overly complicated for those interested in writing object-oriented extensions for PHP. I would study the C++ library first to understand that code. Then the similarities between it and the PECL package stick out and it becomes easier to understand the more obscure Zend bits.

Hopefully these tips help someone out with their extension writing efforts.

Thursday, May 29, 2014

FreedomPop "free" plan is a bit dishonest

I've recently been exploring the world of FreedomPop on behalf of a friend who is going through a really rough patch in his life. FreedomPop sells WiFi hotspot devices that supposedly get 500MB of data per month for free. The only thing to pay for is the device itself, which will set someone back about $50. It sounds awesome, but "sounds" is really all it is.

The old adage, "If it sounds too good to be true, it probably is" definitely applies here. This plan is marketed as 500MB of data per month for free "forever" (the 'forever' is implied). Even 3 years of use (i.e. until the lithium ion battery wears out) would be enough to help my friend out in a significant way. I was willing to play guinea pig for this interesting service because I also have some potential use for it.

So, I went and bought the device, which set me back a little over $50 (after tax). Then I waited. And waited. And waited. And waited some more. And pretty much forgot about it until it randomly showed up about a month and a half later. In addition, these devices are "refurbished", but who really cares about that as long as they work? At any rate, the extensive waiting is the first warning sign that something might be fishy here.

The device that arrives is a "Sprint (now Netgear) Overdrive Pro (3G/4G)" hotspot. The free plan claims to run only on 4G (you have to pay to get 3G), which is technically accurate. What FreedomPop fails to mention up front is that it only runs on 4G WiMAX and that the device has no support for 4G LTE. So you can be bathed in 4G LTE service all day long but the device will never connect to it. It's a tad misleading as users think they will be connecting to 4G regardless of the type of 4G service. Unless a user is intimately familiar with all of the various forms of 4G out there, it is unreasonable to expect them to understand the difference between 4G WiMAX, 4G LTE, and other 4G variants.

Additionally, the device has to be manually configured before it will function properly. This may be beyond the skill set of some users. The 3G PRL and 3G profile have to be updated via the admin before it will connect to 3G. Again, 3G isn't free but it will connect once the PRL and profile are updated. The first time I tried this, the device had fits and I had to perform a soft reset (hold the reset button for six seconds while it is powered on) and try again before it succeeded the second time. Also, firmware updates have to be applied via the admin before 4G WiMAX will function at optimal settings. That last part is tricky because it looks like a large batch of refurbished devices, including mine, were modified in a way that prevents updates from being applied to the device. It appears that someone intentionally changed the SKU of each device from SKU 1453010 to SKU 1453012. The device firmware checks the SKU and checksums of a new firmware before applying it. So multiple users are getting the message "The update cannot proceed. There is a SKU version mismatch." when they upload the latest firmware.

(It may(!) be possible to alter the SKU of the device via a configuration file import, but the importer appears to verify a checksum, so that creates a new problem since the configuration file can't simply be edited with a text editor. The "simple" solution to that problem is to find someone with a SKU 1453010 device and import their configuration, which should correct the problem and allow the firmware update to proceed. I am still working on this approach, so don't try anything here yet. I'm willing to brick my device at this point.)

Also, Sprint is terminating WiMAX service in 2015. Anyone on the free plan currently able to get 4G WiMAX service will suddenly have a paperweight unless they sign up and pay for 3G service. 3G still enjoys a wider adoption rate, but, as anyone who has used 3G knows, it is rather sluggish.

Basically this reads as:

Warehouse operator: "Oh man, we've got all these devices sitting around our warehouse and Sprint is going to make them basically useless in under a year. We need to move this inventory out right now."

Marketing director: "I know! We can just give users WiMAX for free but not tell them about it until they've received the device and try to use it. We'll just advertise it as a 'free service with 4G only' because people will love the idea of 'free 4G'. We'll get rid of the devices and make some money."

It's definitely a brilliant strategy for moving inventory that no one will want soon enough. In addition, the way it is being marketed also phrases things such that people can be misled into believing that they will also get 3G service for free, which they won't. It is a bit dishonest to do that to people. In particular, this plan is being advertised to people who are classified as "low income" as being a way to get free Internet access "everywhere" they go (the 'everywhere' is implied). If 4G WiMAX is readily available in the area, it might be a viable temporary solution for someone who has no Internet access. It is also potentially useful as a device for setting up a quick WiFi LAN between two WiFi-enabled devices vs. messing around with ad-hoc networks. So it isn't really a scam, but the way it is marketed isn't completely honest either - being especially unfair to low income individuals and families who can't afford a $50 loss.

If it had worked out (i.e. 4G LTE capable), this plan would be a game changer in the industry. It would force every telecom to finally lower their rates to sane levels. If you pay more than $10/month for unlimited text, talk, and data, then you are being ripped off and are paying too much for service.

Friday, May 09, 2014

Is Firefox 29.0 "ugly"? Then try this...

I run Windows 7 Ultimate, full Aero effects, and Firefox. The most recent update of Firefox to version 29.0 resulted in yet another redesign of the tabs. This redesign is very unfortunate because it makes the text completely unreadable. I have some very choice words for the Firefox developers about their general competence, but that's not what this post is about.

This post is for the average user that this garbage release was foisted upon. I've been running the fix for about a week now and, while not perfect, it is much better than not being able to read the text on my tabs at all.

Go here:

https://addons.mozilla.org/en-US/firefox/addon/white-to-gray-gradient/

Click "Add to Firefox". Done.

That's the best theme I could find that balances readability, usability, and some semblance of elegance.

Saturday, May 03, 2014

Reinventing the office chair

The office chair I use at home, which I bought a decade ago for about $100 (it was on sale), is starting to fall apart. One would hope that a decade of innovation would result in improvements.

First off, the office chair you sit in probably looks like this:

[image: a basic office chair]

If you are lucky enough to work for a really nice employer, you might get one of these:

[image: an office chair with armrests]

Oooh...armrests!

You know what the ironic thing is? Neither of those chairs was designed to be sat in for hours, yet they cost just as much as (if not more than) a decent chair. Those chairs exist due to some insane thought process that says managers and executives get nicer chairs to sit in as a person moves up the corporate ladder. I'm sorry, but that's just cruel. If you are going to sit in a chair for 6-12 hours a day, then it had better be comfortable to sit in regardless of who you are. Sitting in the wrong chair for hours on end can and will result in regular headaches and/or migraines (I'm speaking from personal experience here).

So what constitutes a comfortable chair? Leather vs. cloth is usually the first thing people consider. I consider other things. For instance, when I wake up in the morning, I sit in my office chair and put my bare feet on the legs. If the legs of the chair were made of metal, I'd be really annoyed because metal tends to be colder than plastic. Fortunately, the legs of my current chair are made of plastic.

I also lean back in my chair and rest my head. I have what is known as a "high-back" office chair. I measured the back of my current chair as being 28" tall (starting from the inside). I can find fairly cheap chairs that come up to 27" tall. That one inch difference is night and day - I have to tilt my head back uncomfortably to reach the headrest on a 27" chair. To get a 28" back on a chair, I have to go to the "big and tall" section, which immediately adds $250 to the price tag (probably because of hydraulic systems that support heavy people, which I don't need). Unfortunately, the closest chair to my desired measurements that I can find has...metal legs...arg! My $100 decade-old chair beats a $350 chair that's made today. As you can imagine, this is incredibly frustrating AND a waste of my time. Time better spent developing software!

Alright, enough ranting. Onto my wonderfully innovative idea: The ability to craft your own modular office chair from compatible parts. I would love to be able to mix and match:

  • Seat
  • Back
  • Armrests
  • Hydraulic system
  • Legs
  • Rollers
I could buy each part individually and then put the whole thing together myself. I'd be able to put together a $175 office chair that meets all of my requirements in half an hour from a single shopping trip. This isn't rocket science and it is VERY silly that we don't have this yet.

Through my recent experience, I've come to the singular conclusion that one size does NOT fit all. We should all go to our local office supply stores and request that they start carrying modular office chair equipment.