Here’s yet another reason why PHP is deficient when it comes to doing Real Work with Real Databases.
We’re currently working on a PostgreSQL-based database with six million book records. We want to walk
the book table and slurp up the rows as we go. The code looks like this.

$sql = 'select field1, field2, ... field8 from book';
$set = pg_query( $sql );
while ( $row = pg_fetch_row( $set ) ) {
// Do some processing
}
pg_free_result( $set );

Nothing crazy, right? I’m just walking the table, one row at a time. However,
the PHP PostgreSQL functions slurp up
all six million rows, even though my calling code doesn’t need to see all the rows at once.
This eats up tons of RAM and makes us worry about memory allocation for the PHP process.
Too bad for us: we can't even get to the first pg_fetch_row() call, because pg_query() slurps up all the rows immediately.
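For what it's worth, the workaround we've settled on is managing a server-side cursor by hand, so PostgreSQL only hands back one batch at a time. This is just a sketch: it assumes $conn is an open pg_connect() handle, and the batch size of 1000 is arbitrary.

```php
<?php
// Cursor-based workaround sketch. Assumes $conn is an open
// pg_connect() handle; 1000 rows per batch is an arbitrary choice.
pg_query( $conn, 'BEGIN' );  // cursors only live inside a transaction
pg_query( $conn, 'DECLARE book_cur CURSOR FOR select field1, field2 from book' );

while ( true ) {
    $set = pg_query( $conn, 'FETCH 1000 FROM book_cur' );
    if ( pg_num_rows( $set ) == 0 ) {
        pg_free_result( $set );
        break;  // no rows left
    }
    while ( $row = pg_fetch_row( $set ) ) {
        // Do some processing
    }
    pg_free_result( $set );  // free each batch as we go
}

pg_query( $conn, 'CLOSE book_cur' );
pg_query( $conn, 'COMMIT' );
```

Memory stays bounded by the batch size instead of the size of the whole table, at the cost of an extra round trip per batch.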

Now, there IS functionality where you can retrieve a given row, like:

$row = pg_fetch_row( $set, $rowno );

True, the library would have to iterate internally to reach the given $rowno, but even then it wouldn't have to read all the way to the end of the result set.
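Along the same lines, you can fake lazy fetching from the calling side by paging the query yourself. A rough sketch, assuming the book table has an integer primary key named id to give a stable order (keyset paging on id, rather than OFFSET, which gets slow on big tables) and that $conn is an open pg_connect() handle:

```php
<?php
// Keyset-paging sketch. Assumes book has an integer primary key "id"
// and $conn is an open pg_connect() handle.
$last_id = 0;
while ( true ) {
    $set = pg_query( $conn,
        "select id, field1, field2 from book where id > $last_id order by id limit 1000" );
    if ( pg_num_rows( $set ) == 0 ) {
        pg_free_result( $set );
        break;  // walked the whole table
    }
    while ( $row = pg_fetch_row( $set ) ) {
        $last_id = $row[0];  // remember where this batch ended
        // Do some processing
    }
    pg_free_result( $set );
}
```

Each pg_query() call still buffers its result, but now that's at most 1000 rows, not six million.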

If that’s not lazy enough coding on the back end, check out
this little nugget from the docs for the pg_free_result
function:

pg_free_result() frees the memory and data associated with the specified PostgreSQL query result resource.

This function need only be called if memory consumption during script execution is a problem. Otherwise, all result memory will be automatically freed when the script ends.

The PHP docs encourage sloppy coding. Rather than promoting good, safe programming behaviors, they’re saying “Eh, no big deal.” It’s like the PHP team knows that
PHP won’t be used on large datasets, and it becomes a self-fulfilling prophecy.

PHP needs to be left to the ghetto of guestbooks and the odd bulletin board system. Leave the real work to the real languages.