wording. The tests are currently evaluated using a simple
<command>diff</command> comparison against the outputs
generated on a reference system, so the results are sensitive to
small system differences. When a test is reported as
<quote>failed</quote>, always examine the differences between
expected and actual results; you might find that the
differences are not significant. Nonetheless, we still strive to
maintain accurate reference files across all supported platforms,
so it can be expected that all tests pass.
</para>
<para>
The actual outputs of the regression tests are in files in the
<filename>src/test/regress/results</filename> directory. The test
script uses <command>diff</command> to compare each output
file against the reference outputs stored in the
<filename>src/test/regress/expected</filename> directory. Any
differences are saved for your inspection in
<filename>src/test/regress/regression.diffs</filename>.
(When running a test suite other than the core tests, these files
of course appear in the relevant subdirectory,
not <filename>src/test/regress</filename>.)
</para>
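<para>
As a minimal illustration (assuming an in-tree build run from the top of
the source tree, and using the <literal>boolean</literal> test only as an
example name), a single test's actual output can be compared against its
expected file by hand:
<programlisting>
diff src/test/regress/expected/boolean.out src/test/regress/results/boolean.out
</programlisting>
An empty diff means that test's output matched the reference file exactly.
</para>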
<para>
If you don't
like the <command>diff</command> options that are used by default, set the
environment variable <envar>PG_REGRESS_DIFF_OPTS</envar>, for
instance <literal>PG_REGRESS_DIFF_OPTS='-c'</literal>. (Or you
can run <command>diff</command> yourself, if you prefer.)
</para>
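<para>
For instance, assuming a <command>diff</command> that accepts
<option>-u</option> for unified output, an invocation along these lines
applies that option to the whole run:
<programlisting>
PG_REGRESS_DIFF_OPTS='-u' make check
</programlisting>
</para>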
<para>
If for some reason a particular platform generates a <quote>failure</quote>
for a given test, but inspection of the output convinces you that
the result is valid, you can add a new comparison file to silence
the failure report in future test runs. See
<xref linkend="regress-variant"/> for details.
</para>
<sect2 id="regress-evaluation-message-differences">
<title>Error Message Differences</title>
<para>
Some of the regression tests involve intentionally invalid input
values. Error messages can come from either the
<productname>PostgreSQL</productname> code or from the host
platform system routines. In the latter case, the messages can
vary between platforms, but should reflect similar
information. These differences in messages will result in a
<quote>failed</quote> regression test that can be validated by
inspection.
</para>
</sect2>
<sect2 id="regress-evaluation-locale-differences">
<title>Locale Differences</title>
<para>
If you run the tests against a server that was
initialized with a collation-order locale other than C, then
there might be differences due to sort order, and subsequent
failures. The regression test suite is set up to handle this
problem by providing alternate result files that together are
known to handle a large number of locales.
</para>
<para>
To run the tests in a different locale when using the
temporary-installation method, pass the appropriate
locale-related environment variables on
the <command>make</command> command line, for example:
<programlisting>
make check LANG=de_DE.utf8
</programlisting>
(The regression test driver unsets <envar>LC_ALL</envar>, so it
does not work to choose the locale using that variable.) To use
no locale, either unset all locale-related environment variables
(or set them to <literal>C</literal>) or use the following
special invocation:
<programlisting>
make check NO_LOCALE=1
</programlisting>
When running the tests against an existing installation, the
locale setup is determined by the existing installation. To
change it, initialize the database cluster with a different
locale by passing the appropriate options
to <command>initdb</command>.
</para>
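<para>
For example, a cluster with a different locale could be created like this
(the locale name and data directory are only placeholders; substitute your
own):
<programlisting>
initdb --locale=sv_SE.UTF-8 -D /usr/local/pgsql/data
</programlisting>
</para>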
<para>
In general, it is advisable to try to run the
regression tests in the locale setup that is wanted for
production