Tag: LinkedIn

  • Review: Senario NRG MicroFly RC Hovering UFO

     

    Senario NRG MicroFly RC Hovering UFO

    In a previous post I talked about the Senario Alien Microfly a bit and in this post I will provide a full review. I gave a few units to some of my family for Christmas so I have flight reports from them as well.

    The Senario Alien Microfly kit comes with a transmitter and the Microfly itself. The transmitter takes 6 “AA” batteries and also serves as the charger for the Microfly. It is pretty small (see the pictures below) and a lot of fun to fly around the house or office.

I put in many, many flights. Each flight is about 5 minutes with a 15-20 minute charge time. The cats gave it a few taste tests but mostly they like to just stalk it as it flies around the living room. 😉 I bought 5 of these for myself and my family for Christmas and all of them worked out of the box. Here is a list of pros/cons:

    Pros

    • It is simple to fly. There is only one control to make it go up or down so you don’t need a lot of experience.
    • Cheap. You will probably find it for $25 or less.
• Durable. It won’t survive being sat or stepped on (it is mostly just foam board), but mine has been through many crashes and even survived a few taste tests by the cats.
    • The flight time is about 5 minutes which I think is pretty good for something so small.

    Cons

• In very large rooms (gym/church/warehouse) it can quickly get out of range if it doesn’t have walls for the IR signal to bounce off of.
• After running through three or four sets of rechargeable “AA” batteries, flight times have fallen off quite a bit, although I did get many flights on each set of batteries.
    • Fragile. Although it can withstand being bounced off a few walls it is very small and made of foam so you don’t want to leave it someplace where it will be sat/stepped on.
    • No directional flight. It only goes up and down.
    • Charge time is kind of high… about 15-20 minutes per flight

    Conclusion

Overall I give the Microfly 3 out of 5 stars. I would easily rate it higher if it maintained its power after extended use. I don’t know if the built-in battery has just been recharged too many times or if the motor is reaching the end of its life since it is so tiny and spins at such high RPMs. Despite this, I would say it is easily worth the price and would recommend it to anyone who enjoys RC toys.

    Images

    Senario NRG MicroFly RC Hovering UFO

    Senario NRG MicroFly RC Hovering UFO

    Video

     

  • How to use the file_get_contents() function to make an HTTP request from PHP

In a previous post I talked about using the HttpRequest object and functions in the PECL_HTTP extension to make HTTP requests from PHP. In some cases you may be limited to using functionality built into the PHP core. The file_get_contents() function has fewer features than the PECL_HTTP extension but it is built into PHP 4.3 and up. Here is an example of using it to retrieve the landing page at www.example.com:

<?php
echo file_get_contents("http://www.example.com");
?>

    Someone hit that easy button.

The file_get_contents() function, as well as many other PHP file functions, implements a streams abstraction layer largely conceived by Wez Furlong. This abstraction layer is what enables many of the PHP file functions to access network resources. Given this functionality, “file” seems a misnomer.
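To illustrate the streams layer, here is a small sketch using the built-in data:// wrapper (available since PHP 5.2.0). The same “file” function reads from an in-memory data URL instead of the disk:

```php
<?php
// The streams layer lets file functions read from any registered wrapper.
// Here file_get_contents() "opens" a data: URL rather than a file on disk.
echo file_get_contents('data://text/plain,hello streams');

// stream_get_wrappers() lists the wrappers available (http, ftp, php, ...).
print_r(stream_get_wrappers());
?>
```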

    The file_get_contents() function uses an HTTP GET request but what if you want to do a POST without using cURL or the PECL_HTTP extension? Furlong posted an article here on how to do just that.
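Furlong’s article has the full details, but the general shape of a POST with file_get_contents() looks something like this sketch (the form field names here are made up for illustration):

```php
<?php
// Sketch: an HTTP POST using only core PHP. The field names are
// hypothetical -- a real form would define its own.
$postdata = http_build_query(array('name' => 'Mark', 'comment' => 'Hello'));

$context = stream_context_create(array(
    'http' => array(
        'method'  => 'POST',
        'header'  => "Content-Type: application/x-www-form-urlencoded\r\n",
        'content' => $postdata)));

echo file_get_contents("http://www.example.com", false, $context);
?>
```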

    This next code example uses the file_get_contents() function again but this time a few options are set first using the stream_context_create() function:

<?php
$http_options = stream_context_create(array(
    'http' => array(
        'user_agent' => "Mark's Browser",
        'max_redirects' => 3)));
echo file_get_contents("http://www.example.com", false, $http_options);
?>

    Note that the array passed to the stream_context_create() function can also be used to specify a POST method, which is how Furlong does so in his blog post.

There is yet another way to make an HTTP request from PHP that I haven’t covered: the built-in cURL functions. I will cover these in a separate blog post.

  • How to make your PHP application check for its dependencies

    The very informative phpinfo() function

    The phpinfo() function displays just about everything you want to know about your PHP installation. It includes info on all your PECL and PEAR modules so it is a quick way to check what’s installed. It will also tell you useful web server information including any query strings you pass. To display all this good info just point your browser at a page on your server that contains the following code:

<?php
phpinfo();
?>

    That’s it!

    Automatic dependency checking

    The phpinfo() function will give us a page that displays a lot of info. You can pass it bitwise constants to narrow down the information displayed but what if we want to check for specific items?
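As a quick example of those bitwise constants, this sketch limits the report to just the loaded modules and configuration directives:

```php
<?php
// Show only the loaded modules and configuration directives instead of
// the full phpinfo() report.
phpinfo(INFO_MODULES | INFO_CONFIGURATION);
?>
```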

    In my humble opinion, when developing software or anything in general, it is a good idea to design things so that the end user will not even need a manual because the user interface is obvious. When something doesn’t work, it should be equally obvious how to fix it.

    If you write a PHP application that others will install and use, it is a good idea to check for dependencies when they try to use the application. This way even if they don’t read your documentation they will quickly know why the software is not working.

Using phpversion(), PHP_VERSION, and version_compare() to check the PHP version

    To get the core PHP version you can use either of the following methods:

<?php echo phpversion(); ?>
or
<?php echo PHP_VERSION; ?>

    The above code should output something like this:

    5.2.6-2ubuntu4
    or
    5.2.6-2ubuntu4

If you are using Ubuntu or some other distribution, you will note that some additional stuff is tacked on to the version number (e.g. “-2ubuntu4”). This makes a comparison to your expected version a little tricky but you can use a substr()/strpos() combo to get what you need. There is an easier way to do the comparison though. The version_compare() function is “PHP-standardized” version aware. So we can do something like this:

<?php
// The minimum version here (5.2.0) is just an example requirement.
if (version_compare(PHP_VERSION, '5.2.0', '>=')) {
    echo 'Your PHP version meets the minimum requirement.';
} else {
    echo 'This application requires PHP 5.2.0 or higher. '.
      'Please upgrade your PHP installation.';
}
?>

    Now you can check the PHP version and notify the user if it is not the minimum required version.
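For completeness, the substr()/strpos() combo mentioned above might look like this sketch, which strips anything after the first dash (the helper name is mine):

```php
<?php
// Strip a distribution suffix like "-2ubuntu4" from the version string.
function core_version($version) {
    $pos = strpos($version, '-');
    return ($pos === false) ? $version : substr($version, 0, $pos);
}

echo core_version('5.2.6-2ubuntu4'); // prints "5.2.6"
?>
```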

The PHP function documentation for each function at www.php.net includes the PHP versions that contain the function in the upper left hand corner:

    substr function version on php.net

    You can use this to learn what versions of PHP include the functions you are using in your code to help identify your minimum PHP version requirement.

    Using get_loaded_extensions() to check for extensions

The get_loaded_extensions() function will return an array of loaded PHP extensions that you can use to check if a specific extension is installed. Use it in combination with the in_array() function to check if the extension you require is loaded. In this example I check if the PECL_HTTP module is installed:

<?php
if (in_array('http', get_loaded_extensions())) {
    echo 'The PECL_HTTP extension is loaded.';
} else {
    echo 'The PECL_HTTP extension is not loaded.';
}
?>

You can use the phpversion() function to check if an extension is loaded and, if so, its version. This code example not only checks if the PECL_HTTP module is installed, but also checks its version:

<?php
if (phpversion('http') === false) {
    echo 'Please download the PECL_HTTP extension '.
      '<a href="http://pecl.php.net/package/pecl_http">here</a>'.
      ' and install it.';
} else {
    if (version_compare(phpversion('http'),'1.6.0','>=')){
        echo 'The PECL_HTTP extension is installed and '.
          'version 1.6.0 or higher. You are all set!';
    } else {
        echo 'Please upgrade your PECL_HTTP extension to '.
          'version 1.6.0 or higher. You can download it '.
          '<a href="http://pecl.php.net/package/pecl_http">here</a>'.
          '.';
    }
}
?>

    Use function_exists() to check for individual functions

So far the methods for checking dependencies have been somewhat broad. They check that the script has a certain version of PHP or certain extensions installed, and that will likely be good enough in most cases. If you really want to be thorough you can also check if specific functions are available using the function_exists() function. In this example I check that the http_get() function, which is part of the PECL_HTTP extension, is there before I use it. If it is not, I use the less-featured, built-in file_get_contents() function.

<?php
if (function_exists('http_get')) {
    echo 'Using the http_get() function:<br />' .
        http_parse_message(http_get("http://www.example.com"))->body;
} else {
    echo 'Using the file_get_contents() function:<br />' .
        file_get_contents("http://www.example.com");
}
?>

    Check for include files

    Here is a simple way to check for include files. It doesn’t verify their content but you can at least make sure they are there:

<?php
// The file name below is just an example.
if (file_exists('includes/functions.php')) {
    include 'includes/functions.php';
} else {
    echo 'The include file includes/functions.php is missing!';
}
?>

    Wrap up

Checking dependencies is an important part of building robust software and hopefully the above techniques will help accomplish that. Even if your end users are very technical they will likely appreciate a good dependency checking mechanism that quickly tells them what’s missing to save them time. If your software will be used by non-technical users you might want to automatically and gracefully downgrade your software feature set instead of generating errors and asking them for something they won’t know how to do. Usability is king!

  • 25 ways to insecurity

    The 2009 CWE/SANS Top 25 Most Dangerous Programming Errors was recently released by CWE/SANS.

Most of the items are old news but I think it is a good checklist that should be part of the boilerplate for web application design documents. By putting security requirements in the software specification and design documents, the project manager can then allocate time and resources to the security aspects of development. In addition, it reminds developers to ask themselves whether the software is meeting those requirements throughout the development process. The alternative is thinking about security only after the entire application has been written and discovering a fundamental design flaw that requires rewriting a good portion of the application.

I particularly appreciate that each item on the CWE/SANS list is weighted by factors including weakness prevalence, remediation cost, attack frequency, attacker awareness, etc. No project has an unlimited budget but you can prioritize where to focus your resources to achieve the most secure solution. Generally it is a good idea to ensure that the cost of defeating an application’s security far outweighs any benefits to be gained from doing so. The cost of defeating an application might include labor time, computing resources, fines, and the threat of jail time with a cell mate named Bubba.

It is quite a challenge to develop secure web applications because, by their nature, they generally need to accept user input. I believe that it is typically much more difficult to develop a secure system than it is to break into the system given the same number of hours, so there is often more burden on the developer. It might take only two or three days to develop a working database-driven web application but many additional weeks to harden it against attacks and make it reliable, scalable, and highly available. Including security requirements in the software specification and design is essential to planning and allocating resources.

Ideally, automated tests should be included to continuously test for vulnerabilities throughout the life of an application. This way security vulnerabilities introduced by code changes will be detected early in the development process instead of later in production. Automated tests could attempt buffer overflows, SQL injections, etc. and could be executed prior to a developer’s check-in or on a nightly cron job that automatically checks out the code and runs the tests against it. Although costly to implement initially, automated security testing will likely pay for itself many times over the course of an application’s life. I plan to talk more about automated testing in future posts.

  • PHP HttpRequest class options and notes

    In a recent post I talked about the PECL_HTTP extension for PHP. In this post I will cover a few of the options you can set for the HttpRequest object and related functions.

The PECL_HTTP extension allows you to set a number of options when you make a request. Usually you put the options in a key=>value array and pass the array as an argument to the request functions (e.g. http_get(), http_post_data(), http_request(), etc.) or assign the array to the HttpRequest object using the setOptions() method. Here is a code example using the HttpRequest object:

$http_req = new HttpRequest("http://www.example.com");
$http_req->setOptions(array('timeout'=>10, 'useragent'=>"MyScript"));

In the above code the timeout is set to 10 seconds and the user agent, which is a request header that identifies the browser to the server, is set to “MyScript”. I am going to cover just a few of the options but a full list of all the request options can be found here.

    timeout

The timeout option specifies the maximum amount of time in seconds that the request may take to complete. Set this too high and the HTTPD process that PHP is running in could stall for quite a while waiting for a request that may never complete. If you set it too low you might have problems with sites that are just slow to respond. This may require some tweaking depending on what you are doing; if you are making requests to a web server in Taiwan you might want to set the timeout a bit higher. The default timeout does not appear to be documented in the HttpRequest options page on PHP.NET (that I could find) but if you look at the http_request_api.c file in the PECL_HTTP source code, it looks like it is 0L, i.e. no timeout at all:

    HTTP_CURL_OPT(CURLOPT_TIMEOUT, 0L);

This indicates it will wait forever unless you explicitly set a timeout, so it might be a good idea to set one! I put together a page that makes an HTTP request and another page that sleeps for some number of seconds that I can test against.

    Here is the code for the page that will sleep:

    <?php

    echo "Sleeping.";
    sleep(30);

    ?>

    Here is the code for the page that will make the HTTP request:

    <?php
    $http_req = new HttpRequest("http://localhost/alongtime.php");
$http_req->setOptions(array('timeout'=>10, 'useragent'=>"MyScript"));
    try {
        $http_req->send();
    } catch (HttpException $ex) {
        if (isset($ex->innerException)){
            echo $ex->innerException->getMessage();
            exit;
        } else {
            echo $ex;
            exit;
        }
    } //end try block
    echo $http_req->getResponseBody();
    ?>
    

    When I pull up the page that makes the HTTP request in my browser I get the following error:

    Timeout was reached; Operation timed out after 10000 milliseconds with 0 bytes received (http://localhost/alongtime.php)

If I don’t set the timeout option at all, the page responds 30 seconds later, since it will wait forever, or at least through the 30 second sleep time on the target page.

    connecttimeout

The connecttimeout option indicates the maximum amount of time in seconds that the request may take just connecting to the server. This does not include the time it takes for the server to process and return the data for the request. This option has the same considerations as above, although the number should be considerably lower since it covers only the connection and not the whole request. Again, the default value is not documented but if you look at the http_request_api.c file in the PECL_HTTP source code, it looks like it is 3 seconds:

    HTTP_CURL_OPT(CURLOPT_CONNECTTIMEOUT, 3);

    dns_cache_timeout

One of the interesting features of the PECL_HTTP extension is that it will cache DNS lookups. Some of the Windows operating systems do this but many of the Linux distributions do not by default. (By the way, if you want to clear your cached DNS lookup entries on a Windows box, use the command “ipconfig /flushdns”.) If you are making multiple requests to the same site, DNS lookup caching should provide a significant performance advantage because a round trip to the DNS server isn’t required for every request. The dns_cache_timeout option sets the number of seconds that will pass before the cached DNS lookup results expire and a new DNS lookup is performed. Again, the default value is not documented but if you look at the http_request_api.c file in the PECL_HTTP source code, it looks like it is 60 seconds, which is probably fine for most applications:

    HTTP_CURL_OPT(CURLOPT_DNS_CACHE_TIMEOUT, 60L);
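If you are hammering the same host with many requests, you could stretch the cache lifetime; a sketch (assuming the PECL_HTTP extension is loaded):

```php
<?php
// Cache DNS lookups for 5 minutes instead of the 60 second default.
$http_req = new HttpRequest("http://www.example.com");
$http_req->setOptions(array('dns_cache_timeout' => 300, 'timeout' => 10));
$http_req->send();
?>
```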

    redirect

The redirect option determines how many redirects the request will follow before it returns with a response. The default is 0 (this IS documented), which may not work in many situations because some applications respond with one or two redirects for authentication, etc. If you set this too high your application may get bounced around too many times and never return. I have not tried it, but you could probably get stuck in a redirect loop. Anyway, a value of around 4 or 5 should be adequate for most applications, I would imagine.
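Setting the option looks like this sketch (again assuming PECL_HTTP is loaded):

```php
<?php
// Follow up to 5 redirects; getResponseCode() will then reflect the
// last response received.
$http_req = new HttpRequest("http://www.example.com");
$http_req->setOptions(array('redirect' => 5, 'timeout' => 10));
$http_req->send();
echo $http_req->getResponseCode();
?>
```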

    useragent

    The useragent option allows you to specify a different User-Agent request header to send to the server than the default which is “PECL::HTTP/x.y.z (PHP/x.y.z)” where x.y.z are the versions.

    I made a little one-liner test page that returns the user agent info sent to the server:

    <?php

echo $_SERVER['HTTP_USER_AGENT'];

    ?>

    If I make an HTTP request to this page using the HttpRequest object without setting the useragent I get:

    PECL::HTTP/1.6.2 (PHP/5.2.6-2ubuntu4)

    If I do something like this:

$http_req->setOptions(array('timeout'=>10, 'useragent'=>"Mark's Browser"));

    I will get:

Mark's Browser

The reason I bring this up is that some applications you make a request to may respond differently depending on your user agent setting. In some cases you may need to spoof a specific browser to get what you are after.
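For example, to mimic a mainstream browser you might send a User-Agent string like this one (the exact string below is just an example of the format Internet Explorer 7 used):

```php
<?php
// Spoof a browser-style User-Agent for servers that sniff the agent.
$http_req = new HttpRequest("http://www.example.com");
$http_req->setOptions(array(
    'useragent' => 'Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1)',
    'timeout'   => 10));
$http_req->send();
echo $http_req->getResponseBody();
?>
```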

    Conclusion

    As mentioned before, there are many more HttpRequest options. I just covered a few notable ones that I have some limited experience with.

  • Review: Lego 6211 Star Wars Imperial Star Destroyer

    Jen gave me the Lego 6211 Star Wars Imperial Star Destroyer for Christmas and I finished building it over the following week.

The build is pretty easy (even for a 35 year old) and a lot of fun. The kit comes with over 1300 parts so you really feel like you’re building something and not just putting a couple of halves together. The parts come in numbered bags that correspond to the numbered sections in the instructions. There are often multiple bags with the same number for a single section. Lego included a few extras of the tiniest parts that tend to get lost in the carpet.

There are only a couple of downsides to the kit. The two “laser blasters” that launch projectiles work by you quickly depressing a launch button that drives a wedge piece behind the projectile to push it out. This is kind of silly and I think the kit would have been better if they were just static elements.

    The other negative was the capsule that goes inside. It is assembled by connecting two specially molded pieces together. I don’t like this kind of Lego construction and would have preferred it if they just left it out or designed it to build from smaller, standard Lego pieces.

    I put together a list of pros and cons:

    Pros

    • Over 1300 parts so it will keep you building for a bit.
    • Numbered bags.
    • Spare tiny pieces.
    • Looks great!
    • It’s big. About 23 inches long.

    Cons

    • The laser blasters that shoot projectiles are kind of cheesy.
    • The capsule that goes inside is mostly just two halves that go together.
    • Doesn’t shoot real bolts of green light. 😉

    Build gallery

    I took some pictures while my Imperial Star Destroyer was under construction:

    Embellishment

    I felt the kit would have been better if they included lights and a small universe. Fortunately, I was able to accomplish this with a small string of Christmas lights, a telescope photo of the stars, black velvet, and a little Photoshop magic:

    Conclusion

    I easily give this kit 5 out of 5 stars. It was fun to build and looks great.

  • How to: PECL HTTP request exception and error handling

    In a previous post, we created a simple PHP page that told us if http://www.example.com is up or down by using the PECL HTTP extension to make an HTTP request to the site and look for the string “example” in the response. Here is the code for our test page, httptest.php:

    <?php
    
    $http_req = new HttpRequest("http://www.example.com");
$http_req->setOptions(array('timeout'=>10, 'useragent'=>"MyScript"));
    $http_req->send();
    
    if (stripos($http_req->getResponseBody(), "example") === false){
        echo "The page is down!";
    } else {
        echo "The page is up!";
    }
    
    ?>
    

    The problem with this code is that there is no error handling. Below are a few examples of what can go wrong and the resulting errors:

    DNS Lookup Failure

    If the DNS lookup fails the page will return the following error:

Fatal error: Uncaught exception 'HttpInvalidParamException' with message 'Empty or too short HTTP message: ''' in /var/www/httptest.php:12 inner exception 'HttpRequestException' with message 'Couldn't resolve host name; Couldn't resolve host 'www.somewebsitethatdoesnotexist.com'
(http://www.somewebsitethatdoesnotexist.com/)' in /var/www/httptest.php:4 Stack trace: #0 /var/www/httptest.php(12): HttpRequest->send() #1 {main} thrown in /var/www/httptest.php on line 12

    Since www.example.com is a valid DNS name I used “www.somewebsitethatdoesnotexist.com” instead to demonstrate what happens with an invalid name or failed DNS lookup. Note the “inner exception” that says “Couldn’t resolve host name”. More on “inner exceptions” in a bit. This is not very pretty for a diagnostic page.

    Connection Failure

    In this example I again used “www.somewebsitethatdoesnotexist.com” but I added the following entry to the /etc/hosts file on the server:

    10.10.10.10 www.somewebsitethatdoesnotexist.com

Now the DNS entry will resolve using the /etc/hosts file but this is not a valid IP for any machine on my network so I see this error:

Fatal error: Uncaught exception 'HttpInvalidParamException' with message 'Empty or too short HTTP message: ''' in /var/www/httptest.php:12 inner exception 'HttpRequestException' with message 'Timeout was reached; connect() timed out! (http://www.somewebsitethatdoesnotexist.com/)' in /var/www/httptest.php:4 Stack trace: #0 /var/www/httptest.php(12): HttpRequest->send() #1 {main} thrown in /var/www/httptest.php on line 12

Again we have an inner exception buried in all of that telling me that the connection timed out.

    404 Error

In this example I put in “http://localhost/notarealpage.php” for the URL. This will connect to the local Apache server, but that page doesn’t exist so the server will return a 404 file not found error. The server responded, but since we are not checking the response code our code just tells us the page is down. That is true, but it would be useful to know that it is because the page is missing!

    The page is down!

    If the server responds OK we will get a 200 status code. We should handle any other response appropriately.

    Handle the exceptions

The first thing we can do is put a try-catch block around our code and catch the HttpException as shown in the example section of the documentation for the HttpRequest::send() method:

    <?php
    
    $http_req = new HttpRequest("http://www.example.com");
$http_req->setOptions(array('timeout'=>10, 'useragent'=>"MyScript"));
    
    try {
        $http_req->send();
    } catch (HttpException $ex) {
        echo $ex->getMessage();
    }
    
    if (stripos($http_req->getResponseBody(), "example") === false){
        echo "The page is down!";
    } else {
        echo "The page is up!";
    }
    
    ?>
    

    If there is a time out or connection failure the HttpException is caught and we see this:

    Empty or too short HTTP message: ''The page is down!

    Hmm… that is not very informative and the same error is displayed for both a name lookup failure and a connection timeout. We can also try changing:

    echo $ex->getMessage();
    to
    echo $ex;

    Now we get this:

exception 'HttpInvalidParamException' with message 'Empty or too short HTTP message: ''' in /var/www/httptest.php:16 inner exception 'HttpRequestException' with message 'Couldn't resolve host name; Couldn't resolve host 'www.ssomewebsitethatdoesnotexist.com'
(http://www.ssomewebsitethatdoesnotexist.com/)' in /var/www/httptest.php:5 Stack trace: #0 /var/www/httptest.php(16): HttpRequest->send() #1 {main}The page is down!

That is painfully ugly but at least we get the “Couldn’t resolve host name” message in there so we know what went wrong. Still, we can do better.

    In addition to putting a try-catch around the send() method you probably should surround all of your HttpRequest code with a try-catch that eventually catches “Exception” to be safe.

    The undocumented inner exception

    The HttpException object, which is not really documented all that well as of this writing, has something completely undocumented called an inner exception. The inner exception is a more detailed exception that is nested inside the HttpException object. We can check if an inner exception is set and if so, display that instead:

    <?php
    
    $http_req = new HttpRequest("http://www.example.com");
$http_req->setOptions(array('timeout'=>10, 'useragent'=>"MyScript"));
    
    try {
        $http_req->send();
    } catch (HttpException $ex) {
        if (isset($ex->innerException)){
            echo $ex->innerException->getMessage();
            exit;
        } else {
            echo $ex;
            exit;
        }
    }
    
    if (stripos($http_req->getResponseBody(), "example") === false){
        echo "The page is down!";
    } else {
        echo "The page is up!";
    }
    
    ?>
    

    Now we get just the part we are interested in:

Couldn't resolve host name; Couldn't resolve host 'www.ssomewebsitethatdoesnotexist.com'
(http://www.ssomewebsitethatdoesnotexist.com/)

    If the lookup is OK but we get a connection timeout we now see this:

    Timeout was reached; connect() timed out! (http://www.somewebsitethatdoesnotexist.com/)

    If no inner exception is detected the HttpException is echoed.

    Check status codes

Sometimes the server may be responding but we do not get a 200 status. This could be because of a redirect, security error, missing page, or a 500 server error. The HttpRequest object provides a getResponseCode() method so we can check what the response was and handle it appropriately. If redirects are followed, the last received response is used. For this example we will simply echo out some of the common status codes if we don’t get a 200:

    <?php
    
    $http_req = new HttpRequest("http://www.example.com/blah");
$http_req->setOptions(array('timeout'=>10, 'useragent'=>"MyScript"));
    
    try {
        $http_req->send();
    } catch (HttpException $ex) {
        if (isset($ex->innerException)){
            echo $ex->innerException->getMessage();
            exit;
        } else {
            echo $ex;
            exit;
        }
    }
    
    $response_code = $http_req->getResponseCode();
    
switch ($response_code){
    case 200:
        break;
    case 301:
        echo "301 Moved Permanently";
        exit;
    case 401:
        echo "401 Unauthorized";
        exit;
    case 404:
        echo "404 File Not Found";
        exit;
    case 500:
        echo "500 Internal Server Error";
        exit;
    default:
        echo "Did not receive a 200 response code from the server";
        exit;
}
    
    if (stripos($http_req->getResponseBody(), "example") === false){
        echo "The page is down!";
    } else {
        echo "The page is up!";
    }
    
    ?>
    

This handles a few of the more common response/status codes. To test the code we can put in a valid URL but a bogus page (e.g. http://www.example.com/blah). If everything works right we now see the following response from our diagnostic page:

    404 File Not Found

    Final Notes

Our little diagnostic page can now handle most of the errors it will likely encounter when it attempts to test our example site, example.com. If we wanted to take this a bit further we could add a database back-end that maintains a list of multiple sites we would like to test. To take things a step further we could execute this PHP script from the command line via a cron job that runs every 5 minutes. We could then have the script send an e-mail when a site is down with the problem indicated in the message. If we wanted to maintain some uptime stats we could log the outages to a database and generate uptime/SLA reports daily, weekly, yearly, etc.
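The e-mail step could be sketched like this (the recipient address and message wording are placeholders; a real monitor would call the helper from the status checks above):

```php
<?php
// Hypothetical alert helper for a cron-driven monitor. The recipient
// address and message text are placeholders.
function notify_down($site, $reason) {
    return mail('admin@example.com',
        "Site down: $site",
        "The check for $site failed at " . date('r') . "\nReason: $reason");
}

// Example usage (requires a working MTA on the server):
// notify_down('http://www.example.com', '404 File Not Found');
?>
```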

    In reality, I would just use something like IPSentry or Nagios to conserve effort for future generations but it was nice to think about. 😉 Happy coding!

  • How to: Find your php.ini file and enable PHP error reporting

On some distributions PHP error reporting or display of errors is disabled by default as a security precaution. This is a good idea for production systems because errors may reveal useful information to undesirables. In a development environment, on the other hand, it is generally useful to see your errors. 😉 If error display is disabled you may just see a blank browser window/empty page when you expect an error. To enable errors and error display, find your php.ini file and make sure the following lines are set:

    ;Show all errors, except for notices and coding standards warnings
    error_reporting = E_ALL & ~E_NOTICE

    display_errors = On
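If you can’t edit php.ini (on shared hosting, for example), the same settings can usually be applied per-script at the top of the page. Note that parse errors will still yield a blank page, since they occur before these lines run:

```php
<?php
// Runtime equivalents of the php.ini directives above.
error_reporting(E_ALL & ~E_NOTICE);
ini_set('display_errors', '1');
?>
```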

    On Ubuntu you can find the php.ini file here:
    /etc/php5/apache2/php.ini

    On other distributions try:
    /etc/php.ini

    On Windows you might find it here:
    c:\windows\php.ini

    If you are running XAMPP it will be in the php folder off the XAMPP root.

    You will need to restart Apache (or IIS as the case may be) so your changes will take effect:

    On Ubuntu:

    sudo /etc/init.d/apache2 restart

    On other distributions you might try:

    sudo /etc/init.d/httpd restart
  • How to use the PECL HTTP (PECL_HTTP) Extension to make HTTP requests from PHP

    PECL HTTP is a feature-rich PHP extension that allows you to make HTTP and HTTPS (SSL) requests from your PHP code and handle the responses. If you are not familiar with PECL, it is a library of extensions to add functionality to PHP. It uses the same package and delivery system as PEAR.

Many distributions do not install PECL_HTTP by default even if you install PHP. If you try to use one of the PECL_HTTP objects or functions (e.g. http_get()) without the extension installed you will likely get something like this:

    Fatal error: Call to undefined function http_get()

    If this error comes up but you think you installed the PECL_HTTP package and it shows up in phpinfo(), then it is possible your PECL_HTTP install failed and did not get cleaned up so phpinfo() still sees it. This may happen if you didn’t install the cURL source library dependency first (see below).

So let’s pretend we own the site http://www.example.com (see RFC 2606). We want to build a PHP diagnostic page that checks whether www.example.com returns the string “example” somewhere in the page, indicating whether the page is up or down. First I need to install the PECL_HTTP extension. For details on how to install a PECL extension see my post “How to install a PHP PECL extension/module on Ubuntu”. For now I am going to assume that the php-pear and php5-dev packages have already been installed. These instructions are based on an Ubuntu install:

    • Install the libcurl3-openssl-dev package. The PECL_HTTP extension requires some of the cURL source libraries so we will have to install the cURL library package first:
      sudo apt-get install libcurl3-openssl-dev

      If you don’t install the cURL source library package you will likely see the following error when you attempt to install the PECL_HTTP extension:

      checking for curl/curl.h... not found
      configure: error: could not find curl/curl.h
      ERROR: ‘/tmp/pear/temp/pecl_http/configure --with-http-curl-requests --with-http-zlib-compression=1 --with-http-magic-mime=no --with-http-shared-deps’ failed
    • Install the PECL_HTTP module with the following command:
      sudo pecl install pecl_http

      The installer may ask you about some specific options but unless you really know what you want, you can probably just hit enter one or more times to accept all the defaults. If all goes well, the module should download, build, and install.

    • Once the install is complete, it will probably ask you to add an “extension=http.so” line to your php.ini file. Open up the php.ini file in your favorite text editor and add the line under the section labeled “Dynamic Extensions”. On Ubuntu the php.ini file seems to be located in the /etc/php5/apache2 folder:
      sudo nano /etc/php5/apache2/php.ini
    • Now that the php.ini file has been updated, Apache will need to be restarted so the new extension will be loaded:
      sudo /etc/init.d/apache2 restart

      That should restart Apache on Ubuntu but if that doesn’t work you can try:

      sudo /etc/init.d/httpd restart

    At this point hopefully the PECL_HTTP extension is installed, so now we can create a PHP script that makes an HTTP request to http://www.example.com and displays the results. For this example I will use the http_get() function, which takes the URL string as its first argument. (The http_request() function shown below instead takes a predefined request method constant such as HTTP_METH_GET as its first argument and the URL string as its second.) I created a file named httptest.php (using “sudo nano /var/www/httptest.php”) with the following code and put it in the /var/www folder (the default HTTP root on an Ubuntu server):

    <?php

    echo http_get("http://www.example.com");

    ?>

    or you could use the http_request() function instead to do the same thing:

    <?php

    echo http_request(HTTP_METH_GET,"http://www.example.com");

    ?>

    When the page is opened in a web browser (e.g. http://sparky/httptest.php) it returns something like this:

    HTTP/1.1 200 OK Date: Sun, 04 Jan 2009 22:41:54 GMT Server: Apache/2.2.3 (CentOS) Last-Modified: Tue, 15 Nov 2005 13:24:10 GMT ETag: "b80f4-1b6-80bfd280" Accept-Ranges: bytes Content-Length: 438 Connection: close Content-Type: text/html; charset=UTF-8

    You have reached this web page by typing "example.com", "example.net", or "example.org" into your web browser.

    These domain names are reserved for use in documentation and are not available for registration. See RFC 2606, Section 3.

    That’s it. Those are some pretty quick one-liners if we are fine with the default options. This time we’ll do something similar but use the HttpRequest object instead and set a timeout and a different user agent:

    <?php

    $http_req = new HttpRequest("http://www.example.com");
    $http_req->setOptions(array("timeout" => 10, "useragent" => "MyScript"));
    $http_req->send();
    echo $http_req->getRawResponseMessage();

    ?>

    The output is the same as the previous two commands but this time the server could have taken up to ten seconds to respond before the request timed out. In addition, we sent the user agent string “MyScript” in the User-Agent request header. If you don’t want the HTTP response headers included in the output, you can use the getResponseBody() method instead:

    <?php

    $http_req = new HttpRequest("http://www.example.com");
    $http_req->setOptions(array("timeout" => 10, "useragent" => "MyScript"));
    $http_req->send();
    echo $http_req->getResponseBody();

    ?>

    This outputs:

    You have reached this web page by typing "example.com", "example.net", or "example.org" into your web browser.

    These domain names are reserved for use in documentation and are not available for registration. See RFC 2606, Section 3.

    No response headers this time. In fact, it looks as though you typed http://www.example.com in the browser.

    I could set some URL query parameters using the setQueryData() method if example.com were a dynamic page that accepts arguments, but I am pretty sure it does not.
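    For illustration only, here is what setting query parameters would look like; the parameter names below are made up, since example.com ignores them:

    ```php
    <?php
    $http_req = new HttpRequest("http://www.example.com");
    // Hypothetical parameters; the request URL becomes
    // http://www.example.com/?page=2&sort=date
    $http_req->setQueryData(array("page" => 2, "sort" => "date"));
    $http_req->send();
    echo $http_req->getResponseBody();
    ?>
    ```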

    For the purpose of our example it doesn’t look like we have gotten very far, but PHP is now getting hold of the response data before we see it, so we are halfway there. Now all we need to do is search for the string “example” and return some type of indicator letting us know whether the example.com page is up or down:

    <?php
    
    $http_req = new HttpRequest("http://www.example.com");
    $http_req->setOptions(array("timeout" => 10, "useragent" => "MyScript"));
    $http_req->send();
    
    /*Note: stripos() returns the position of the first case-insensitive match, or false if there is no match. Since a match at position 0 is also falsy, we must compare strictly against false.*/
    if (stripos($http_req->getResponseBody(), "example") === false){
        echo "The page is down!";
    } else {
        echo "The page is up!";
    }
    
    ?>
    

    If everything is working correctly we will see:

    The page is up!

    If example.com is broken or doesn’t return the string “example” our test page returns:

    The page is down!

    This is great and all but you may have noticed there is no error handling to speak of which isn’t good. I will talk about HTTP request/PECL_HTTP error handling in a separate post but until then, happy HTTPing!
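    As a stopgap until that post, here is a minimal sketch of catching request failures; it assumes pecl_http’s HttpException class, which the extension throws for things like timeouts and DNS failures:

    ```php
    <?php
    $http_req = new HttpRequest("http://www.example.com");
    $http_req->setOptions(array("timeout" => 10, "useragent" => "MyScript"));
    try {
        $http_req->send();
        if (stripos($http_req->getResponseBody(), "example") === false) {
            echo "The page is down!";
        } else {
            echo "The page is up!";
        }
    } catch (HttpException $e) {
        // Network-level failures (timeout, DNS, refused connection) end up here.
        echo "The page is down! (" . $e->getMessage() . ")";
    }
    ?>
    ```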

  • How to install a PHP PECL extension/module on Ubuntu

    PHP PECL extensions provide additional functionality over the base PHP install. You can browse the PHP PECL extensions available at the PECL repository here. The following steps show how to install a PECL extension/module on Ubuntu, using the PECL_HTTP extension as an example, and assume that you already have Apache 2 and PHP 5 installed:

    • First, you will need to install PEAR via apt-get to get the necessary package and distribution system that both PEAR and PECL use. From a shell prompt enter:
      sudo apt-get install php-pear

      You will be prompted to confirm the install. Just press “y” and enter. If all goes well you should see it download and install the php-pear package.

      Note: “sudo” is used to provide the super user privileges necessary for the following command. So in this case the command “apt-get install php-pear” is being executed with super user privileges by preceding it with “sudo”. Unless configured otherwise, you will normally be prompted to enter a password when you use sudo. This is usually the same password that you logged in with.
    • Now you will need to install the php5-dev package to get the necessary PHP5 source files to compile additional modules. Enter the following from a shell prompt:
      sudo apt-get install php5-dev

    If you do not install the php5-dev package and try to install a PECL extension using “pecl install”, you will get the following error:

      sh: phpize: not found
      ERROR: `phpize’ failed
    • The PECL_HTTP extension requires an additional dependency package to be installed. You can probably skip this for other extensions:
      sudo apt-get install libcurl3-openssl-dev
    • Now we are finally ready to actually install the extension. From a shell prompt enter the following, but substitute “pecl_http” with the name of the PECL extension you are installing:
      sudo pecl install pecl_http

      The installer may ask you about some specific options for the extension you are installing. You can probably just hit enter one or more times to accept all the defaults unless you want to set specific options for your implementation. If all goes well, the module should download, build, and install.

    • Once the install is complete, it will probably ask you to add an “extension=” line to your php.ini file. Open up the php.ini file in your favorite text editor and add the line under the section labeled “Dynamic Extensions”. On Ubuntu the php.ini file seems to be located in the /etc/php5/apache2 folder:
      sudo nano /etc/php5/apache2/php.ini

      In this example, the pecl_http extension install asked me to add “extension=http.so”.

    • Now that the php.ini file has been updated, Apache will need to be restarted so the new extension will be loaded:
      sudo /etc/init.d/apache2 restart

      That should restart Apache on Ubuntu but if that doesn’t work you can try:

      sudo /etc/init.d/httpd restart

    If all went well your PECL extension should now be working. You might want to write a PHP test page that will test the basic functionality of the extension to make sure everything is working OK. You can use this later to check that all your required extensions are installed and working when you deploy to a new server. In this example where I installed the PECL_HTTP module, I might write a PHP page that uses the extension’s HttpRequest class to go get a page and return the results.
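    As a sketch of that idea (the file name and messages here are my own invention, not part of the install output), a test page for the pecl_http install might look like:

    ```php
    <?php
    // exttest.php - quick smoke test that pecl_http is loaded and working.
    if (!extension_loaded('http')) {
        die('pecl_http is not loaded; check the extension= line in php.ini.');
    }
    $req = new HttpRequest("http://www.example.com");
    $req->send();
    echo "pecl_http works; got HTTP response code " . $req->getResponseCode();
    ?>
    ```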

    That’s it. For the next extension install you can skip the steps to install the php-pear and php5-dev packages.

  • Book Review: Smart and Gets Things Done: Joel Spolsky’s Concise Guide to Finding the Best Technical Talent

    3 out of 5 stars

    In my previous post I reviewed Joel Spolsky’s Joel on Software:… (I will spare you the full title). In this review I will be talking about one of the followup books to that, Smart and Gets Things Done: Joel Spolsky’s Concise Guide to Finding the Best Technical Talent. There are also a couple of other Joel books I have not read yet that are worth noting.

    A lot of the content in Smart and Gets Things Done seems to overlap with Joel’s other books. I understand that most of the books are just a rehash of his blog, but I guess I was a little disappointed that there was duplication of content between his books. For about $12, however, this book is still a pretty good value, particularly if you don’t already have his other books.

    In this book, Joel explains how to get the best programmers, though it seems to lean more towards hiring the best college grads. Joel argues that the best programmers are so highly productive that it is worth the extra pay and effort to bring them in as opposed to a mediocre programmer. I agree with this to some degree. I am not sure I totally agree with some of his techniques for evaluating who the best programmers are though.

    Joel is an advocate of spending some extra dollars on perks for his interviewees and employees. One example is that he has a uniformed limo service pick up interviewees at the airport. At first this sounds ludicrous but the more I thought about it, the more it started to make sense. If you have gone through the labor of filtering all those resumes and conducting all those phone interviews to find the best candidates to interview in person, then perhaps it is worth it to give them the treatment to sell the job. I checked, and uniformed limo service from JFK to Manhattan runs less than $150. Considering a typical NYC IT salary, that is a drop in the bucket if it will help land a top notch programmer.

    The book mentions purchasing a $900 chair for employees. I am not sure I could sit in a $900 chair without feeling like a total snob, but a $300 one seems to make sense. A comfortable programmer is likely a productive one. Nothing says you don’t care about your employees like an old, broken, stained, $100 office chair. The other thing the book mentions is giving employees dual monitors and top end computers. I think this is good advice and it probably isn’t as expensive as you might think. For example, say you spend less than $1,000 to buy a new office chair, second monitor, dual head video card, and an extra 2GB stick of RAM for each employee, and you expect those extras to last three years. That is about $27 a month per employee. I think that is a small price to pay for happy employees who feel valued. If that chair is super comfy you might even make up the cost in overtime work because they won’t be in such a hurry to get out of the chair at the end of the day. 😉

    There are a few things in the book that I disagree with and this might be just because I don’t have enough experience to know better yet. The first is that the book says incentive pay doesn’t work. I disagree. I already talked about this in my review of Joel on Software and I won’t delve into it further here.

    Another item in the book I don’t agree with is the concept that you don’t need an idea to build a software company. I suppose you don’t, but it probably helps! I don’t have an MBA but I think it is good business practice to identify a need or problem and build a product to fill it. Good programmers are great and all but I don’t think “best working conditions” + “best programmers” + “best software” always equals profit. You could build the best software but it won’t be profitable if the market is too small (you need the best sales people to pull that off). Perhaps I just misunderstood the first part of chapter 1.

    One of the suggested interview questions to help separate the wheat from the chaff is a pointer recursion question. I think it is more important to evaluate the skills the interviewee claims to have. If they put in their resume that they have experience in a specific language, then ask them to write something in that language. An outstanding web developer may never have touched pointers before because they simply never had to. Yes, there are leaky abstraction cases, but typically those result in looking something up on Google rather than re-writing a module in C. Also, just because someone understands recursion and pointers doesn’t mean they will be the best programmer for the job. They might have no understanding of object-oriented languages because all they have used is a procedural language such as C, although admittedly that is less likely in this day and age.

    The book goes on to say that working with pointers in C is more of an aptitude than a skill. Over the last year I have had a crash course in C/C++, and based on my own experience I argue that it is not the case that only a few people have an aptitude for pointers. The problem is that pointers are often poorly explained in many references, and the syntax is a bit deceiving because the * symbol has a completely different meaning depending on whether you are declaring a pointer or using it (“this is a pointer” versus “dereference this pointer”). If 90% of a computer science class isn’t getting something (as mentioned in the book) then I would say the professor should consider a different instruction strategy. Fortunately I had a pretty good instructor.

    Despite some of my misgivings I think this book is worth the money especially if you don’t have any of the Joel on Software books already. There are many helpful tips including where to go looking for candidates, how to help employees feel at home in your organization, and how to turn around an existing team that is on the rocks.

  • Book Review: Joel on Software…

    4 out of 5 stars

    The full book title is actually Joel on Software: And on Diverse and Occasionally Related Matters That Will Prove of Interest to Software Developers, Designers, and Managers, and to Those Who, Whether by Good Fortune or Ill Luck, Work with Them in Some Capacity.

    I felt the title was a bit too long for my blog post title so please excuse the abbreviated version. 😉 Other than the title and a few other points which I will cover shortly, I think it is a very good book written by someone who obviously has years of experience in the software industry. The author is Joel Spolsky and the content mostly consists of a series of short essays from his blog at http://www.joelonsoftware.com. Although you can read most of the content on his blog I think it is worth owning the printed book.

    There are some technical sections in the book but it mostly focuses on software development and management in general. This is the kind of advice you would get from an experienced mentor. Of particular interest is the Joel Test. This is a list of 12 yes or no questions regarding things your software team should be doing to produce better code. The more items for which you can answer yes, the higher your team’s score. In my experience, such as it is, this is a pretty good test but there are a couple of things I feel are missing in the case of web application development. More on this shortly.

    Joel talks about five worlds of software development:

    1. Shrinkwrap
    2. Internal
    3. Embedded
    4. Games
    5. Throwaway

    I never really thought about it this way but it makes a lot of sense to do so. Depending on the type of development you are doing you will have wildly different priorities. As an example, Joel points out that embedded software often can never be upgraded, so its quality requirements are much higher than those of throwaway software that will only be used once to massage an input file. An awareness of the business model you are developing for should help sort your priorities.

    There were a few things in the book I disagreed with. One example is that Joel believes that incentive pay is harmful. I think that, managed correctly, it is not. One way to handle this is to simply reward employees privately for going above and beyond. In other words, don’t tell employees you are going to reward them for doing X or you will run into the kind of problems that Joel describes. Instead, just thank them with some reward after the fact and explain that this won’t always be the case, so there is no expectation or set formula for them to work around. This will make employees feel more appreciated for going the extra mile and they will likely continue to do so even if there is no guarantee they will be rewarded again. It is kind of a pay-it-forward incentive.

    I think employee ownership in the company is another good incentive. An individual employee may not have much control over the stock price of the company, but it does provide the employee some justification to themselves as to why they should go above and beyond. After all, if you are hiring smart people, they will be smart enough to ask themselves why they are putting in that extra overtime to improve the quality of your product. Unless you are managing a company that feeds starving children in Africa, smart employees will feel better about working overtime on a weekend if they are benefiting in some remote way and not just making some rich stockholders or owners even richer while their own salary remains flat.

    There are a couple items that I think are missing from the Joel Test that are important for the web application world:

    • Is high availability, reliability, scalability, and performance part of your spec?
    • Do you load test?
    • Do you write automated unit and integration tests?

    High Availability, Reliability, Scalability, and Performance

    I have a little over a year of web application development experience, but I have been supporting web applications for well over ten years now, and if nothing else I have learned that you need to plan for availability, reliability, scalability, and performance:

    • High availability ensures that your web application will be available when your users attempt to access your web application.
    • Reliability specifies how often your web application will work correctly. Just because your application is highly available doesn’t necessarily mean it always works right.
    • Scalability planning ensures that if your site is suddenly Slashdotted you can quickly add capacity or if your user base grows you can shard accounts across multiple database servers. A common bottleneck in web applications is data writes and you can get to a point where no single server will have enough write throughput for a high volume web site. Since this is a bottleneck on the database side, adding application servers is useless. A spec and/or design document needs to include a plan for eventually distributing data across multiple database servers for DB write intensive applications. It is not unheard of for a dot com to re-engineer a good portion of their software base in a big hurry to handle growth. Rushed software updates will likely create disgruntled employees and a poor user experience.
    • Performance indicates how quickly a web application will respond to a request. This should include internet lag time and take into consideration where users will be accessing the site from. If you are going to have a large user base in Australia, for example, it might be a good idea to consider implementing a content delivery network with a point of presence in Australia.

    I think it is important to include very specific numbers for each of these items in the spec because each will strongly determine the level of effort and ultimately the cost of a project. As you move closer to 100% availability and reliability, costs will likely go up exponentially in both development time and hardware. Scalability planning needs to be in the spec so design time can be allocated for it on the project plan. Each incremental improvement will likely require more development time and/or hardware.

    Load Testing

    Load testing is critical to assessing performance bottlenecks in web applications. Your application’s performance may be spectacular with one user but what about 5000 users generating 500 requests a second to your database driven web application? It is good to know before it happens so you can plan accordingly and verify you are actually bottlenecking on over utilized hardware and not a “false bottleneck”. I define a false bottleneck as a situation where none of your server hardware is fully utilized and yet you can’t get any more throughput. This can be caused by a number of things such as a propagation delay, uncompressed images and JavaScript using up all the network bandwidth, or even something like a sleep(2) that someone put in the code for debugging and forgot to remove.

    I believe optimizing your code without load test data can be a time sink. If you are optimizing your code without any data you might spot a Shlemiel the Painter’s Algorithm and then spend six hours fixing it, resulting in a few microseconds of execution time saved. That sounds great, but if you had done some load testing and monitored your stats, you might have noticed that table scan on your database server costing you over half a second per transaction that could be fixed in 10 minutes with a single create/alter index statement. Load testing helps show your actual bottlenecks versus your perceived ones.

    Automated Unit and Integration Tests

    Automated testing is essential for the same reasons as a daily build… to ensure that no one has broken anything. You can use automated unit and integration testing to do a full regression test of your software every day or even prior to every check-in. Daily builds will just identify whether anyone has broken the build, while automated testing will tell you if everything still works like it should.

    Unit tests focus on individual functions or units of code while integration tests verify that your application’s components are all working together per the requirements. A unit test, for example, will tell you if your circleArea() function is returning the correct value for given inputs. An integration test will tell you if data submitted via a form is being stored in the database correctly. Unit tests are good at helping you identify broken units of code at a more granular level while integration tests evaluate the overall functionality of your system as a whole. There are unit testing frameworks to facilitate authoring unit tests for all popular programming languages. In many cases unit testing frameworks can also be effectively implemented to do integration testing as well.
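    As a sketch of the distinction, here is what a hand-rolled unit test of the circleArea() function mentioned above might look like (circleArea() itself is hypothetical, and a framework such as PHPUnit would normally supply the assertions):

    ```php
    <?php
    // The unit under test: area of a circle for a given radius.
    function circleArea($radius) {
        return M_PI * $radius * $radius;
    }

    // Unit tests: check the function in isolation against known inputs.
    assert(circleArea(0) == 0);
    assert(abs(circleArea(1) - M_PI) < 1e-9);
    assert(abs(circleArea(2) - 4 * M_PI) < 1e-9);
    echo "circleArea() unit tests passed";
    ?>
    ```

    An integration test, by contrast, would exercise the whole stack, for example submitting a form over HTTP and then querying the database to confirm the row landed correctly.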

    There is some debate over the line between unit testing and integration testing but I honestly don’t care too much. The ideal goal is that you can execute a single command to do a full regression test of your entire solution. This initially increases your development time but will more than make up for itself over the long run if your software is going to be around for a while and go through many revisions. I have seen a few projects where fixing one bug introduces another or adding a new feature breaks an existing feature. Automated testing will bring these cases to light quickly, before you deploy or get too far down the development cycle.

    Conclusion

    Despite a few omissions and things I disagreed with, I give this book 4 out of 5 stars. Joel obviously has a lot of experience and I learned a lot. Even if you are an experienced developer you might find some valuable insights in this book.