Sunday, July 26, 2015

Speeding up database calls with PDO and iterators

When you review lots of code, you often wonder why things were written the way they were. Especially around expensive database calls, I still see code that could and should be improved.

No framework development

When working with a framework, these database calls are usually optimized for the developer, and the complex logic for retrieving and using data efficiently is abstracted away. But then developers need to build something without a framework and end up using the basics of PHP in a sub-optimal way.

$pdo = new \PDO($dsn, $username, $password); // connection details omitted in the original

$sql = 'SELECT * FROM `gen_contact` ORDER BY `contact_modified` DESC';

$stmt = $pdo->prepare($sql);
$stmt->execute();
$data = $stmt->fetchAll(\PDO::FETCH_OBJ);

echo 'Getting the contacts that changed the last 3 months' . PHP_EOL;
foreach ($data as $row) {
    $dt = new \DateTime('2015-04-01 00:00:00');
    if ($dt->format('Y-m-d') . ' 00:00:00' < $row->contact_modified) {
        echo sprintf(
            '%s (%s) | modified %s',
            $row->contact_name,  // column names assumed for illustration
            $row->contact_email,
            $row->contact_modified
        ) . PHP_EOL;
    }
}
The example code above is the most common way to retrieve data. At first sight this code is clean and looks good, but on closer inspection you will discover a couple of points to improve.

  • The code above is not reusable, so whenever you need similar functionality you're stuck duplicating existing code.
  • Even though you're fetching objects with $stmt->fetchAll(\PDO::FETCH_OBJ);, you still end up with an array of objects, which consumes too much memory when fetching lots of data.
  • Filtering is done within the routine, which means that for every additional filtering condition you need to modify the existing logic, making it hard to maintain and extend.


Most modern frameworks use Iterators for their data retrieval, because they're fast and reusable, and because they allow other Iterators to filter and modify the retrieved results. Building an application without a framework still gives you the option to use Iterators, as they've been part of PHP since version 5.0.0 Beta 2.

So let's assume you continue to use PDO for your data retrieval; we can choose between two options:
  1. Use PDOStatement::fetchAll() to retrieve all data in a single go
  2. Use PDOStatement::fetch() to retrieve a single row per iteration
Even though the first option seems tempting, I prefer option two, as it allows me to create a single Iterator that does the retrieval for me without tying it to the options required to build the query (and thus making it reusable for any kind of retrieval).
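To make the trade-off concrete, here is a minimal sketch of both options. It uses an in-memory SQLite database as a stand-in for the MySQL table from the article (the table name and `contact_modified` column are borrowed from the example above; the two sample rows are made up):

```php
<?php
// Sketch: option 1 (fetchAll) vs option 2 (fetch), against a tiny
// in-memory SQLite stand-in for the gen_contact table.
$pdo = new \PDO('sqlite::memory:');
$pdo->exec('CREATE TABLE gen_contact (id INTEGER, contact_modified TEXT)');
$pdo->exec("INSERT INTO gen_contact VALUES
    (1, '2015-05-01 12:00:00'),
    (2, '2015-03-01 12:00:00')");

$stmt = $pdo->prepare('SELECT * FROM gen_contact ORDER BY contact_modified DESC');

// Option 1: the whole resultset lands in one PHP array.
$stmt->execute();
$all = $stmt->fetchAll(\PDO::FETCH_OBJ);

// Option 2: one row per call, so memory stays constant per iteration.
$stmt->execute();
$count = 0;
while (false !== ($row = $stmt->fetch(\PDO::FETCH_OBJ))) {
    $count++;
}
```

Both loops see the same rows; the difference is that option 1 holds them all in `$all` at once, which is exactly what costs so much memory on large resultsets.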


/**
 * Class DbRowIterator
 *
 * File: Iterator/DbRowIterator.php
 */
class DbRowIterator implements Iterator
{
    /** @var \PDOStatement $pdoStatement The PDO Statement to execute */
    protected $pdoStatement;
    /** @var int $key The cursor pointer */
    protected $key;
    /** @var bool|\stdClass The resultset for a single row */
    protected $result;
    /** @var bool $valid Flag indicating there's a valid resource or not */
    protected $valid;

    public function __construct(\PDOStatement $PDOStatement)
    {
        $this->pdoStatement = $PDOStatement;
    }

    /**
     * @inheritDoc
     */
    public function current()
    {
        return $this->result;
    }

    /**
     * @inheritDoc
     */
    public function next()
    {
        $this->result = $this->pdoStatement->fetch(
            \PDO::FETCH_OBJ, \PDO::FETCH_ORI_ABS, $this->key
        );
        if (false === $this->result) {
            $this->valid = false;
            return null;
        }
        $this->key++;
    }

    /**
     * @inheritDoc
     */
    public function key()
    {
        return $this->key;
    }

    /**
     * @inheritDoc
     */
    public function valid()
    {
        return $this->valid;
    }

    /**
     * @inheritDoc
     */
    public function rewind()
    {
        $this->key = 0;
        $this->valid = true;
        $this->pdoStatement->execute();
        $this->next();
    }
}
The Iterator above just implements the PHP Iterator interface, but for our example this is more than enough to achieve our goal.

As you can see, we implement the logic for data retrieval in next(), as this is our forward retrieval sequence. Take note of the second and third arguments of the PDOStatement::fetch() call: the second argument controls the cursor orientation of the retrieval, and the third argument positions the cursor at the requested row, which is only possible because the statement was prepared with a scrollable cursor outside the Iterator.

class LastPeriodIterator extends FilterIterator
{
    protected $period;

    public function __construct(\Iterator $iterator, $period = 'last week')
    {
        parent::__construct($iterator);
        $this->period = $period;
    }

    public function accept()
    {
        if (!$this->getInnerIterator()->valid()) {
            return false;
        }
        $row = $this->getInnerIterator()->current();
        $dt = new \DateTime($this->period);
        if ($dt->format('Y-m-d') . ' 00:00:00' < $row->contact_modified) {
            return true;
        }
        return false;
    }
}
To filter our data, we can now extend the SPL FilterIterator, which allows us to attach our filtering directly to our DbRowIterator, making it immediately extendable and reusable.
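Because FilterIterator wraps any \Iterator, such filters also compose with the rest of SPL. The sketch below (a hypothetical ModifiedSinceIterator, same idea as LastPeriodIterator, run over plain in-memory rows instead of a PDO statement) stacks a date filter with SPL's LimitIterator:

```php
<?php
// Sketch: stacking a FilterIterator-style date filter with LimitIterator.
class ModifiedSinceIterator extends FilterIterator
{
    protected $period;

    public function __construct(\Iterator $iterator, $period = 'last week')
    {
        parent::__construct($iterator);
        $this->period = $period;
    }

    public function accept()
    {
        $row = $this->getInnerIterator()->current();
        $dt = new \DateTime($this->period);
        return $dt->format('Y-m-d H:i:s') < $row->contact_modified;
    }
}

// Plain in-memory rows standing in for DbRowIterator results.
$rows = new \ArrayIterator([
    (object) ['contact_modified' => '2015-05-01 12:00:00'],
    (object) ['contact_modified' => '2015-03-01 12:00:00'],
    (object) ['contact_modified' => '2015-06-01 12:00:00'],
]);

$recent = new ModifiedSinceIterator($rows, '2015-04-01 00:00:00');
$firstRecent = new \LimitIterator($recent, 0, 1); // cap at one match

$matched = iterator_to_array($firstRecent, false);
```

No row is touched more than once, and neither iterator knows anything about the other; that is what makes the pieces reusable.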

Changing our initial data retrieval code into code that will use both of our Iterators is now very simple:

$pdo = new \PDO($dsn, $username, $password); // connection details omitted in the original

$sql = 'SELECT * FROM `gen_contact` ORDER BY `contact_modified` DESC';
$stmt = $pdo->prepare($sql, [\PDO::ATTR_CURSOR => \PDO::CURSOR_SCROLL]);

$data = new DbRowIterator($stmt);
echo 'Getting the contacts that changed the last 3 months' . PHP_EOL;
$lastPeriod = new LastPeriodIterator($data, '2015-04-01 00:00:00');
foreach ($lastPeriod as $row) {
    echo sprintf(
        '%s (%s) | modified %s',
        $row->contact_name,  // column names assumed for illustration
        $row->contact_email,
        $row->contact_modified
    ) . PHP_EOL;
}
Please pay attention to $pdo->prepare($sql, [\PDO::ATTR_CURSOR => \PDO::CURSOR_SCROLL]);: we now need to ensure the cursor for data retrieval is scrollable, so we can control it row by row.


I know that all this requires a bit of extra work, and you might wonder why you should invest it when the foreach loop was working as well. Let me show you with a benchmark of the two:

Foreach loop

  • Data fetching time for 63992 of 250000 records: 2.14 seconds
  • Data processing time for 63992 of 250000 records: 7.11 seconds
  • Total time for 63992 of 250000 records: 9.25 seconds
  • Memory consumption for 63992 of 250000 records: 217.75MB

Iterator loop

  • Data fetching time for 63992 of 250000 records: 0.92 seconds
  • Data processing time for 63992 of 250000 records: 5.57 seconds
  • Total time for 63992 of 250000 records: 6.49 seconds
  • Memory consumption for 63992 of 250000 records: 0.25MB

Result of this benchmark

  • Data retrieval is faster with Iterators
  • Data processing is faster with Iterators
  • Memory consumption is enormously better with Iterators
Benchmark executed with MySQL 5.5.43 and PHP 5.5.26 on Ubuntu 12.04 LTS (virtual machine). Other versions of PHP, MySQL or the OS might give you different results. The 250000 records were generated using fzaninotto/Faker.


By using simple Iterators in your PHP code you can speed up data retrieval and processing, but the most important thing this benchmark shows is that Iterators save a ton of memory.


Iterators are most effective for processing large amounts of data. For small data sets (approximately under 5000 entries) Iterators might even be slower than using arrays, but you will still win on memory.
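You can verify the memory behaviour yourself without a database. The sketch below (helper names are made up; a generator stands in for the row Iterator) compares the memory growth of building an array of rows against iterating them one at a time:

```php
<?php
// Sketch: array of rows vs one-row-at-a-time iteration.
function rowsAsArray($n)
{
    $rows = [];
    for ($i = 0; $i < $n; $i++) {
        $rows[] = (object) ['id' => $i]; // every row stays in memory
    }
    return $rows;
}

function rowsAsIterator($n)
{
    for ($i = 0; $i < $n; $i++) {
        yield (object) ['id' => $i]; // one row alive at a time
    }
}

$before = memory_get_usage();
$count = 0;
foreach (rowsAsIterator(100000) as $row) {
    $count++;
}
$iteratorGrowth = memory_get_usage() - $before;

$before = memory_get_usage();
$all = rowsAsArray(100000);
$arrayGrowth = memory_get_usage() - $before;
// $arrayGrowth dwarfs $iteratorGrowth
```

Generators need PHP 5.5, which matches the PHP version the benchmark above was run on.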

Monday, June 8, 2015

20 years of php

On June 8, 1995 Rasmus Lerdorf open-sourced the code for his "Personal Homepage" and posted it to the newsgroup comp.infosystems.www.authoring.cgi. Yes, NNTP was hot and famous back in those days! And because Ben Ramsey asked everyone in the PHP community to tell their PHP story, I felt it was time to share mine.

For me, my PHP story begins 6 years after Rasmus published his source code for PHP. In 2001 I started as a System Engineer at Telenet, a cable internet provider in Belgium, where they were looking for someone to develop and maintain their website in PHP. I had worked as a Perl developer in the past and found the change to PHP relatively easy.
In 2005 I started off as a freelancer working as PHP developer and worked at several positions in big enterprises and government agencies.
PHPUnit Pocket Guide
In 2006 I was introduced to PHPUnit, thanks to Sebastian Bergmann's PHPUnit Pocket Guide which seemed like a very smart way to test your applications. So that's when I started writing unit tests for the first time and never stopped writing them.
After working with PHP for 5 years, I wondered whether I was doing a good job, as there was not much to compare with. So in 2006 I got Zend Certified for PHP 4, which opened up a whole new world to me. As a Zend Certified Engineer (ZCE) I received an invite from Zend to join their biggest PHP event of the year: ZendCon 2007. So I saved up all I could spare and booked my flight, ticket and hotel stay.
At ZendCon I was introduced to "The PHP Community" through the Zend Trading Cards, a brilliant idea for introducing the community to newcomers. And I met Mr. Cal Evans, who with his gentle voice and everlasting smile told me I was the one to form a community in my own region. On the last day of ZendCon, I got a phone call from my wife at 2am saying we were expecting a baby! That was probably the wildest night for me at ZendCon.

On June 8, 2008 my wife gave birth to our son Xander. Another sign that PHP would have a major impact on my life.
Together with the help of Felix De Vliegher we started PHPBelgium. Later on we joined forces with the Dutch PHP user group phpGG and formed PHPBenelux.
In 2009 my wife and I founded in2it, where I would keep doing what I was already doing: providing professional PHP consulting services to businesses, giving professional training courses and coaching development teams to improve the way they develop PHP applications. My wife took over the graphical design part, and together we have lifted the company into a known brand for web application development and design.
In that same year I was asked to speak at the Dutch PHP Conference about SPL. Since that first talk I've spoken at several conferences around the world and attended even more.
In 2010 our second son Ares was born during Zend Framework bug hunt days on July 18 and our third son Tycho was born just before PHPBenelux Conference 2013. So yes, all our sons have PHP-ness since their births.

Yes, PHP has given me a basis to earn a good living, pay my bills and support my family. But it also has given me the PHP community: A big, welcoming group of people who I call my distant family. And all thanks to Mr. Cal Evans, the Godfather of the PHP Community.
PHP Godfather

Wednesday, May 27, 2015

Little things can make a difference

Source: theleticiabertin

Wow, I never expected this much involvement when I created an overview of upcoming conferences this fall. I cannot deny I love Markdown to write simple things and I love simplicity. I use IA Writer to have distraction-free editing power on both my phone and laptop.

iA Writer

iA Writer is a minimalist text editor for OS X and iOS developed by Information Architects Incorporated. The idea of iA Writer is "to keep you focused on just writing". iA Writer has been "downloaded 600,000 times by everyone from hobbyist writers to the bestselling author Augusten Burroughs." It is the top selling text editor in the App Store behind Apple's own Pages application.

So last night, returning from php[tek] 2015, I was noting down a small list of upcoming conferences this fall in Markdown. Because my memory was a bit fuzzy from the travel, I decided to paste it into a Gist on GitHub so I could ask people in my network to inform me about missing events.

And wow, it was a rush! Twitter, Facebook and other channels informed me about missing conferences or wrong dates. People even forked my gist to help me update the list!

And this is why I love being part of the PHP Community. You do a small thing and before you know it, people are there to help improve it. And when you read a tweet like the one below, you can't help feeling good that such a little contribution could mean a lot to others.

Thanks for helping everyone, especially those who forked my gist to update it.

Tuesday, May 5, 2015

popen for cli commands and pipes in php

Source: "Pipes 1/3" by Jonah G.S.
I got a question today about using commands that pipe output to other commands within PHP applications.

There are two functions in PHP that are perfect for the task: popen and proc_open.

The function "popen" opens a process file pointer; basically you get a pointer for the duration of a process's execution. This is often useful when you have one-way traffic (like piping commands on the command line).

The function "proc_open" behaves the same as popen, but gives you access to the input and the output, which makes it very useful for reading and writing as you go along.
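A minimal sketch of proc_open, using `cat` as a stand-in command that simply echoes its stdin back (the command and input are made up for illustration):

```php
<?php
// Sketch of proc_open: unlike popen, we get separate pipes for stdin
// and stdout, so we can write input and read output in one process.
$descriptors = [
    0 => ['pipe', 'r'], // stdin: the process reads, we write
    1 => ['pipe', 'w'], // stdout: the process writes, we read
    2 => ['pipe', 'w'], // stderr
];

$process = proc_open('cat', $descriptors, $pipes);
if (!is_resource($process)) {
    throw new \RuntimeException('Cannot open process');
}

fwrite($pipes[0], "Hello World!\n");
fclose($pipes[0]); // closing stdin lets the command finish

$output = stream_get_contents($pipes[1]);
fclose($pipes[1]);
fclose($pipes[2]);
proc_close($process);
```

Swap `cat` for any command that reads stdin (like /usr/bin/crontab below) and the shape stays the same.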

So let's say you have logic that generates a crontab entry; you can always do this using the command line.

/usr/bin/php crontab.php | /usr/bin/crontab
But when you want to run it as a complete process, you can use exec, shell_exec, passthru or system and fiddle with escapeshellcmd. But often this looks messy and is not reusable.

A better approach would be to use "popen". A small example would look something like this:


$output = '*/5 * * * * /bin/echo "Hello World!" 2>&1' . PHP_EOL;
$command = '/usr/bin/crontab';

var_dump(cmdPipe($output, $command));

/**
 * Functionality to pipe output
 *
 * @param string $input The input that needs to be piped
 * @param string $commandline The command the first command needs to be
 * piped to
 * @return string The output of the given command
 * @throws \RuntimeException
 */
function cmdPipe($input, $commandline)
{
    if (false === ($pipe = popen("echo \"$input\"|$commandline", 'r'))) {
        throw new \RuntimeException('Cannot open pipe');
    }

    $output = '';
    while (!feof($pipe)) {
        $output .= fread($pipe, 1024);
    }
    pclose($pipe);

    return $output;
}
DISCLAIMER: This is not secure code and should not be used as-is in production environments!

Build it as a reusable component and you now have piping functionality you can nest, embed and, most of all, reuse.
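One way to make the sketch a little safer before wrapping it into such a component is to escape the input before it reaches the shell. This is still a sketch (cmdPipeSafe is a hypothetical name), not audited production code:

```php
<?php
// Sketch: same popen idea as cmdPipe(), but with escapeshellarg()
// so the input cannot break out of the echo command.
function cmdPipeSafe($input, $commandline)
{
    $cmd = 'echo ' . escapeshellarg($input) . ' | ' . $commandline;
    if (false === ($pipe = popen($cmd, 'r'))) {
        throw new \RuntimeException('Cannot open pipe');
    }

    $output = stream_get_contents($pipe);
    pclose($pipe);

    return $output;
}

// Shell metacharacters survive as plain text instead of executing.
$result = cmdPipeSafe('Hello $USER; rm -rf /', 'cat');
```

Note that `$commandline` is still trusted here, exactly as in the example above; only the piped input is escaped.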

Monday, April 27, 2015

Back from LoneStarPHP 2015

LoneStarPHP 2015
Listening to Phil Sturgeon (courtesy of Ben Marks) at LoneStarPHP 2015
I returned earlier this week from LoneStarPHP 2015, a community PHP conference held in Addison, a suburb of Dallas, Texas.
This conference was a three-day event, with a full day reserved for tutorial sessions where experts from all over the world gave training and hands-on workshops on subjects like PHP foundations, unit testing, systems administration, APIs, security and performance. It was a great decision by the conference organizers to run a full day of training.
Standing in line for a true Texas BBQ at Hard Eight BBQ
LoneStarPHP has a reputation for offering a true Texas BBQ to all speakers, and this year it was again a big success. We ended up at Hard Eight BBQ, one of the best BBQ restaurants in the US. A quarter pound of very tasty brisket, some Jalapeño Chicken Poppers, Spicy Sausage and a few Spare Ribs were on my plate. Just the amount of meat I could handle without getting a meat overdose.
The second and third day were all about PHP. Speakers were giving 50-minute sessions starting at 9am all the way until 5pm. LoneStarPHP attendees were given the best of the best and the audience loved it.

Jeff Carrouth goes over the SOA architecture
I learned interesting things about Dependency Injection, Composer, APIs, SOA, security, testing, using Guzzle to consume HTTP, speaking at conferences, teaching kids to code and what it takes to run a tech company.
Between sessions there was of course the "hallway track": discussions between attendees about all kinds of subjects, which many consider the most important part of any conference.
PHPTownHall and LooselyCoupled doing a joint-podcast session
Of course, evening social activities brought everyone closer and kept the conversation going. When you hang out with a bunch of PHP folks, you end up with a couple of podcast recordings as well. So PHPTownHall and LooselyCoupled joined forces one night, and the amazing Godfather of the PHP community, Mr. Cal Evans, could not resist the urge to record another session of Voices of the ElePHPant: it's the booze talking, with a round-table discussion with conference organizers and community leaders. On both occasions it was a great blast; just go listen to the podcasts.

If you like a good BBQ mixed with new, refreshing and highly educational talks, LoneStarPHP is a conference you must experience.
See you all at LoneStarPHP 2016!

Friday, January 16, 2015

New Year's Cleanup for VAT validator client for VIES service by European Commission

It's a new year, and that's always a good time to clean things up and improve code. That's exactly what I did to make it easier for you to use the VIES service.
VIES is the VAT Information Exchange System provided by the European Commission to allow its members to validate VAT registration numbers of companies registered within the European Union. Since an invoice with an invalid VAT number can result in penalties, this client library is convenient to use in any B2B application.
This tool allows you to simply embed the service within your PHP application using Composer, as the client is available on Packagist. We even included tests, improved quality and examples for you to play with. The source code is freely available on GitHub under the MIT License.
I wish you a very happy 2015!


Friday, November 21, 2014

Running Apigility on Azure

Apigility on Azure
For a couple of years now I've been a fan of Microsoft Azure, the cloud platform by Microsoft. It offers a platform as a service (PaaS) in the form of Azure Websites, which makes it a great solution for prototyping and playing with new stuff.
Last year Matthew Weier O'Phinney announced Apigility at ZendCon, a manager for API design. It was immediately clear that it would revolutionise the way we design and manage REST APIs.
mwop presenting apigility

Getting started

The first thing we need to do is get Apigility onto your local development machine. There are several ways to do this, but I prefer to just clone the Apigility skeleton from GitHub.
git clone dragonbe-demo-api
Since Composer is included with the skeleton, we just need to self-update composer.phar and let it install all required libraries.
cd dragonbe-demo-api/
php composer.phar self-update
php composer.phar install
We're all set now. Time to get started with our first API endpoint. Let's enable development mode first so we have access to the management interface.
php public/index.php development enable
And if you run PHP 5.4 or higher (if not, now is a good time to upgrade) you can fire up the built-in PHP web server (only from PHP 5.4.8 and up) to get started quickly with managing REST API endpoints.
php -S localhost:9999 -t public/ public/index.php
If you now surf to http://localhost:9999 you should see the Apigility welcome screen.
Apigility Welcome Screen

Your first REST endpoint

Click on the button "Get Started" to continue.
getting started
Continue with create new API.
create api
A popup will ask you to provide a name for this API, I've named it "demo" for the purpose of this blog.
api name
The API demo (v1) will be created and an overview will be presented.
api overview
In order to have something useful from this, we need to create a new REST endpoint.
Because we're just demonstrating the usage of Apigility, we select the "code connect" option and name our endpoint "user".
code connect
An overview will give us more details on how to reach this user endpoint.
user overview
When we click on the tab "source code" we get a list of classes used for this user endpoint. We're interested in the Resource Class demo\V1\Rest\User\UserResource.php.
class overview
Let's test our endpoint now using curl.
curl -i http://localhost:9999/v1/user
This will give us the following response.
HTTP/1.1 405 Method Not Allowed
Host: localhost:9999
Connection: close
X-Powered-By: PHP/5.4.30
Content-Type: application/problem+json

{"type":"","title":"Method Not Allowed","status":405,"detail":"The GET method has not been defined for collections"}
This means our configuration is good and we're ready to implement the "business logic".

Let's code a bit

In UserResource.php we look at two methods:
  • fetch($id)
  • fetchAll($params = array())
The first method, fetch(), will give us a single entity, while the second method, fetchAll(), is responsible for returning a collection of entities.
The first thing we do is create a fixture (just for the purpose of this demo) with a list of users.
protected $user = array (
    array ('id' => 1, 'firstName' => 'Matthew', 'lastName' => 'Weier O\'Phinney'),
    array ('id' => 2, 'firstName' => 'Zeev', 'lastName' => 'Suraski'),
    array ('id' => 3, 'firstName' => 'Enrico', 'lastName' => 'Zimuel'),
);
Let's modify the collection method "fetchAll" first, as this is the easy part.
We change the existing code
public function fetchAll($params = array())
{
    return new ApiProblem(405, 'The GET method has not been defined for collections');
}
Into the following
public function fetchAll($params = array())
{
    return $this->user;
}
And when we repeat our request with curl, we get the following result.
curl -i http://localhost:9999/v1/user

HTTP/1.1 200 OK
Host: localhost:9999
Connection: close
X-Powered-By: PHP/5.4.30
Content-Type: application/hal+json

{"_links":{"self":{"href":"http:\/\/localhost:9999\/v1\/user"}},"_embedded":{"user":[{"id":1,"firstName":"Matthew","lastName":"Weier O\u0027Phinney","_links":{"self":{"href":"http:\/\/localhost:9999\/v1\/user\/1"}}},{"id":2,"firstName":"Zeev","lastName":"Suraski","_links":{"self":{"href":"http:\/\/localhost:9999\/v1\/user\/2"}}},{"id":3,"firstName":"Enrico","lastName":"Zimuel","_links":{"self":{"href":"http:\/\/localhost:9999\/v1\/user\/3"}}}]},"total_items":3}
So far so good! Now it's time to add entity functionality as well.
public function fetch($id)
{
    foreach ($this->user as $user) {
        if ($user['id'] === (int) $id) {
            return $user;
        }
    }

    return new ApiProblem(404, 'Specified user was not found');
}
So now we can call it with curl to see how it works.
curl -i http://localhost:9999/v1/user/1
This results in a nice entity result
HTTP/1.1 200 OK
Host: localhost:9999
Connection: close
X-Powered-By: PHP/5.4.30
Content-Type: application/hal+json

{"id":1,"firstName":"Matthew","lastName":"Weier O\u0027Phinney","_links":{"self":{"href":"http:\/\/localhost:9999\/v1\/user\/1"}}}
So now we have a collection and entity functionality for demo purposes. Time to put things online!

First deploy to Azure

Now that we have created our first API endpoint, we can prepare everything to deploy it to Microsoft Azure. There are a few things we need to do before we can actually deploy it, but it's a one-time preparation.
If you haven't downloaded it yet, get the Microsoft Azure SDK for PHP for your development OS and install it; it will make life so much easier.
Once downloaded and installed, you have a CLI application to manage your Azure configuration. More details can be found at the Microsoft Azure PHP Dev Center.

Web root configuration

We need to add a web.config file in the root of our code base; it serves as the configuration that tells IIS where our application root folder is and how to map routing to it.
My web.config for Apigility looks like this:
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <system.webServer>
    <directoryBrowse enabled="false" />
    <httpErrors existingResponse="PassThrough" />
    <rewrite>
      <rules>
        <clear />
        <!-- Rewrite rules to /public by @maartenballiauw *tnx* -->
        <rule name="TransferToPublic-StaticContent" patternSyntax="Wildcard" stopProcessing="true">
          <match url="*" />
          <conditions logicalGrouping="MatchAny">
            <add input="{REQUEST_URI}" pattern="*assets*" />
            <add input="{REQUEST_URI}" pattern="robots.txt" />
          </conditions>
          <action type="Rewrite" url="public/{R:0}" />
        </rule>
        <rule name="TransferToPublic" patternSyntax="Wildcard">
          <match url="*" />
          <action type="Rewrite" url="public/index.php" />
        </rule>
      </rules>
    </rewrite>
    <defaultDocument>
      <files>
        <clear />
        <add value="index.php" />
        <add value="index.html" />
      </files>
    </defaultDocument>
  </system.webServer>
</configuration>
Again, a big shout-out to Maarten Balliauw for helping me create this simplified configuration file for IIS / Microsoft Azure.

Auto-execute Composer install on Azure

Apigility depends on Composer, the popular package manager for PHP. It's a great tool for managing external libraries and dependencies, but it needs to be executed before the application gets deployed.
This is where we use the Microsoft Azure SDK, as we need a deployment script executed immediately after the code is pulled in from GitHub.
I found the article by Ahmed Sabbour very useful as a guideline for preparing post-fetch deployments. We will use the Microsoft Azure SDK to generate our deployment scripts.
azure site deploymentscript --php -t bash 
This will create two files: .deployment and deploy.sh. It's the latter we need to modify to run Composer automatically after fetching our code from GitHub.
Just add the following lines in section "# Deployment" of the script (right after # 1. KuduSync configuration).
# 2. Composer install
php $DEPLOYMENT_TARGET/composer.phar install -v --prefer-dist --no-dev --optimize-autoloader --no-interaction

Prepare for GitHub

Maybe it's time to save and commit our changes. After all, we already have a git repository active.
Let's first figure out what has changed so far.
git status
This gives us the following list:
On branch master
Your branch is up-to-date with 'origin/master'.

Changes not staged for commit:
  (use "git add <file>..." to update what will be committed)
  (use "git checkout -- <file>..." to discard changes in working directory)

    modified:   composer.phar
    modified:   config/application.config.php

Untracked files:
  (use "git add <file>..." to include in what will be committed)


no changes added to commit (use "git add" and/or "git commit -a")
We need to add 5 new files to Git and update 2 files.
git add .deployment config/application.config.old module/demo/ web.config
git commit -am 'First commit of our Demo API'
We still have a small challenge regarding the Apigility project itself: all our code is still tied directly to the project's repository. So we rename the remote of the ZF-Apigility-Skeleton from origin to apigility.
git remote rename origin apigility
Time to create a new project on GitHub for our demo. Log into your account on GitHub or sign up for a new one. Once in your account, create a new repository.
new repo on GitHub
You will get instructions for adding your online repository to your local Git repository, but basically it comes down to the following commands.
git remote add origin
git push -u origin master
Check back online to see your code is there.
verify on GitHub
All is looking good so far. Let's leave our code for a minute because we now want to deploy on Microsoft Azure.

Azure time

Microsoft Azure is a cloud platform operated by Microsoft which provides a rich set of services and options to fulfill your scalability needs. One of the benefits of this platform over others is that it provides a platform as a service (PaaS) in the form of Azure Websites, allowing you to "just deploy" your existing application to the cloud without a bunch of headaches and reconfigurations.
Especially in the world of PHP web application development, this is a huge benefit for launching prototypes, trying scalability improvements and so on. The fact that it runs IIS on a Windows OS in the background should not hold you back: PHP runs as well on Windows as on Linux.
If you don't have a Microsoft Azure account yet, you can sign up for the free trial and try out these steps yourself.
azure free
Once signed up you will be taken to the management dashboard. This is where the magic happens. We need to create a new website, so we click the "+" (plus sign) in the lower left corner, select "compute", then "new website" and choose "custom create".
new website
You will be presented with a popup where you can set a name for your app, create or use an existing hosting plan, pick your preferred region and optionally a database. The last item is what we were looking for: publish from source control.
website settings
The next screen will provide a collection of publishing options you can choose from. Of course in this example we select GitHub as publisher.
github publisher
It will use OAuth authentication to gain access to your personal repositories, so you can select the correct repository from a drop-down list and define the branch you want to monitor for changes. We just take the default branch "master".
github repo selection
Once all details are filled out, the magic starts happening: a new website instance is created, your code is deployed on this instance and your Composer installation is executed. All in one go.
auto deployment
When the publishing is finished, a simple process log shows you exactly what was executed. A more detailed log of the deployment (the Composer installation of packages) is available after publishing.
deployment succesful

We're live

So if you followed these instructions completely, we are now live with our API. Let's check it!
curl -i
This will give us the following response
HTTP/1.1 200 OK
Content-Length: 549
Content-Type: application/hal+json
Server: Microsoft-IIS/8.0
X-Powered-By: PHP/5.4.34
X-Powered-By: ASP.NET
Set-Cookie: ARRAffinity=2ffde9a06ea00acd603e02c27a9e6799c74c65620925e3f7a78146f02026b7a9;Path=/;
Date: Fri, 21 Nov 2014 08:48:35 GMT

{"_links":{"self":{"href":"http:\/\/\/v1\/user"}},"_embedded":{"user":[{"id":1,"firstName":"Matthew","lastName":"Weier O\u0027Phinney","_links":{"self":{"href":"http:\/\/\/v1\/user\/1"}}},{"id":2,"firstName":"Zeev","lastName":"Suraski","_links":{"self":{"href":"http:\/\/\/v1\/user\/2"}}},{"id":3,"firstName":"Enrico","lastName":"Zimuel","_links":{"self":{"href":"http:\/\/\/v1\/user\/3"}}}]},"total_items":3}
All good, what about our entity?
curl -i
Gives us back the entity of Zeev Suraski.
HTTP/1.1 200 OK
Content-Length: 135
Content-Type: application/hal+json
Server: Microsoft-IIS/8.0
X-Powered-By: PHP/5.4.34
X-Powered-By: ASP.NET
Set-Cookie: ARRAffinity=2ffde9a06ea00acd603e02c27a9e6799c74c65620925e3f7a78146f02026b7a9;Path=/;
Date: Fri, 21 Nov 2014 08:50:48 GMT



Apigility allows you to build and manage APIs very easily. Combine this flexibility with the cloud power of Microsoft Azure and you have a very powerful tool to build scalable, high-performance and easy-to-manage APIs.