aggregation logs, as well as in the main and per-script reports, use the
<option>--failures-detailed</option> option. If you also want to distinguish
all errors and failures (errors without retrying) by type, including which
retry limit was exceeded and by how much for serialization/deadlock
failures, use the <option>--verbose-errors</option> option.
</para>
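<para>
For example, a run that retries failed transactions up to ten times and
reports errors and failures in detail might look like this (the database
name <literal>mydb</literal> and the specific values are only illustrative):
<programlisting>
pgbench --max-tries=10 --failures-detailed --verbose-errors -c 10 -T 300 mydb
</programlisting>
</para>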
</refsect2>
<refsect2>
<title>Table Access Methods</title>
<para>
You may specify the <link linkend="tableam">Table Access Method</link>
for the pgbench tables. The environment variable <envar>PGOPTIONS</envar>
specifies database configuration options that are passed to PostgreSQL via
the command line (See <xref linkend="config-setting-shell"/>).
For example, a hypothetical default Table Access Method called
<literal>wuzza</literal> for the tables that pgbench creates can be
specified with:
<programlisting>
PGOPTIONS='-c default_table_access_method=wuzza'
</programlisting>
</para>
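<para>
As a sketch, the tables could then be initialized with that setting in
place (the scale factor and the database name <literal>mydb</literal> are
only illustrative, and <literal>wuzza</literal> remains hypothetical):
<programlisting>
PGOPTIONS='-c default_table_access_method=wuzza' pgbench -i -s 10 mydb
</programlisting>
</para>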
</refsect2>
<refsect2>
<title>Good Practices</title>
<para>
It is very easy to use <application>pgbench</application> to produce completely
meaningless numbers. Here are some guidelines to help you get useful
results.
</para>
<para>
In the first place, <emphasis>never</emphasis> believe any test that runs
for only a few seconds. Use the <option>-t</option> or <option>-T</option> option
to make the run last at least a few minutes, so as to average out noise.
In some cases it can take hours to get reproducible numbers.
It's a good idea to try the test run a few times, to find out whether your
numbers are reproducible.
</para>
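<para>
For instance, a minimal sketch of a five-minute run (all values, including
the database name <literal>mydb</literal>, are only illustrative):
<programlisting>
pgbench -c 10 -j 2 -T 300 mydb
</programlisting>
</para>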
<para>
For the default TPC-B-like test scenario, the initialization scale factor
(<option>-s</option>) should be at least as large as the largest number of
clients you intend to test (<option>-c</option>); else you'll mostly be
measuring update contention. There are only <option>-s</option> rows in
the <structname>pgbench_branches</structname> table, and every transaction wants to
update one of them, so <option>-c</option> values in excess of <option>-s</option>
will undoubtedly result in lots of transactions blocked waiting for
other transactions.
</para>
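<para>
For example, to test with up to 100 clients, the tables could be
initialized with at least that scale factor (the database name
<literal>mydb</literal> and the exact values are only illustrative):
<programlisting>
pgbench -i -s 100 mydb
pgbench -c 100 -j 10 -T 600 mydb
</programlisting>
</para>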
<para>
The default test scenario is also quite sensitive to how long it's been
since the tables were initialized: accumulation of dead rows and dead space
in the tables changes the results. To understand the results you must keep
track of the total number of updates and when vacuuming happens. If
autovacuum is enabled it can result in unpredictable changes in measured
performance.
</para>
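<para>
One way to keep track of this is to check dead-row counts and vacuum times
in <structname>pg_stat_user_tables</structname> between runs, for example:
<programlisting>
SELECT relname, n_dead_tup, last_vacuum, last_autovacuum
FROM pg_stat_user_tables
WHERE relname LIKE 'pgbench_%';
</programlisting>
</para>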
<para>
A limitation of <application>pgbench</application> is that it can itself become
the bottleneck when trying to test a large number of client sessions.
This can be alleviated by running <application>pgbench</application> on a different
machine from the database server, although low network latency will be
essential. It might even be useful to run several <application>pgbench</application>
instances concurrently, on several client machines, against the same
database server.
</para>
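<para>
For example, a run started on a separate client machine against a remote
database server might look like this (the host name, database name, and
numeric values are only illustrative):
<programlisting>
pgbench -h db.example.com -c 100 -j 16 -T 600 mydb
</programlisting>
</para>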
</refsect2>
<refsect2>
<title>Security</title>
<para>
If untrusted users have access to a database that has not adopted a
<link linkend="ddl-schemas-patterns">secure schema usage pattern</link>,
do not run <application>pgbench</application> in that
database. <application>pgbench</application> uses unqualified names and
does not manipulate the search path.
</para>
</refsect2>
</refsect1>
</refentry>