Run AMPL on NEOS


To simplify the work of comparing and testing solvers, we have made AMPL and solver resources available “in the cloud” in collaboration with the NEOS Server project, under the auspices of the Wisconsin Institutes for Discovery. AMPL users can interact with the NEOS server in either of two ways:

  • by requesting execution of both AMPL and solvers at a remote site, or

  • by using a local AMPL session to send optimization problems to remote solvers.

These services are available free of charge through any Internet connection. However, they are intended mainly for testing, prototyping, and instructional purposes; pursuant to the posted terms of use, there are no guarantees of performance or confidentiality.

Remote AMPL with remote solvers: The NEOS website

This facility lets you send an AMPL “job” to one of the NEOS Server’s remote computers. A job consists of an AMPL model file, optionally accompanied by an associated data file and by a commands file containing an AMPL script to be executed.
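For concreteness, here is what the pieces of a very small job might look like; the filenames and the model itself are hypothetical, made up purely for illustration:

```ampl
# tiny.mod -- hypothetical model file
var x >= 0;
var y >= 0;
minimize Cost: 3*x + 2*y;
subject to Need: x + 2*y >= 4;

# tiny.run -- hypothetical commands file
solve;
display x, y;
```

A data file would accompany these if the model declared symbolic sets or parameters whose values are supplied separately; this tiny model needs none.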

When you submit a job, the NEOS Server’s central scheduler locates an appropriate remote computer on which to run it, holding the job in a queue if necessary. When a remote computer becomes available, the Server instructs it to start a copy of the AMPL processor, which handles the rest of the computation: the remote computer automatically reads and executes the submitted files, runs the requested solver, and gathers the resulting output for return to the Server. The Server reports the output on the submission website and optionally emails a copy to the user. AMPL’s display or printing commands can be included in the commands file to produce listings of results that are sent back along with the other AMPL output.

To use this facility, follow a link below to consult the page of instructions for the solver in which you are interested. Solvers currently available for AMPL modeling include the following:

The following solvers also take AMPL input, but have a limited AMPL interface that does not allow for a command file (or “run file”):

Each solver submission page provides summary solver information, links for additional information sources, and specific instructions for job submission. Three methods of communication with the NEOS Server are described:

  • web submission, by entering local filenames directly on the solver submission page;

  • e-mail submission, using a specified format for which a template is provided;

  • communication from a client program in a supported language (Python, Perl, PHP, C, C++, Java, Ruby) through calls to a NEOS API based on XML-RPC.

Use the links in the upper right-hand corner of the submission page to select your submission method, or to request a menu of sample problems for web submission.
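As a sketch of the client-program route, the following Python fragment assembles a NEOS job in the server’s XML format and shows how it would be submitted over XML-RPC. The category, solver name, and model text are placeholders; the endpoint address and the method names (submitJob, getFinalResults) follow the NEOS API documentation, but check the current API description before relying on them.

```python
import xmlrpc.client

NEOS_URL = "https://neos-server.org:3333"  # NEOS XML-RPC endpoint


def build_job_xml(category, solver, model, data="", commands=""):
    """Assemble a NEOS job description in the server's XML job format."""
    return (
        "<document>"
        f"<category>{category}</category>"
        f"<solver>{solver}</solver>"
        "<inputMethod>AMPL</inputMethod>"
        f"<model><![CDATA[{model}]]></model>"
        f"<data><![CDATA[{data}]]></data>"
        f"<commands><![CDATA[{commands}]]></commands>"
        "</document>"
    )


def submit(job_xml):
    """Submit a job and block until the final results come back."""
    neos = xmlrpc.client.ServerProxy(NEOS_URL)
    job_number, password = neos.submitJob(job_xml)
    if job_number == 0:
        # per the NEOS docs, a zero job number signals an error,
        # with the message returned in place of the password
        raise RuntimeError(password)
    result = neos.getFinalResults(job_number, password)  # blocks until done
    return result.data.decode()  # results arrive as binary (base64) data


if __name__ == "__main__":
    # category and solver name are placeholders; consult the NEOS
    # solver pages for the names currently accepted
    xml = build_job_xml("lp", "cplex",
                        "var x >= 0; minimize z: x;",
                        commands="solve; display x;")
    print(xml)
    # To actually submit (requires network access):
    # print(submit(xml))
```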

Local AMPL with remote solvers: The Kestrel client

In this mode of operation, you run your own copy of AMPL on your local computer. But instead of specifying a solver installed on your computer or local network, you invoke Kestrel, a “client” program that sends your problem to a solver running on one of the NEOS Server’s remote computers. The results from the NEOS Server are eventually returned through Kestrel to AMPL, where you can view and manipulate them locally in the usual way. Thus you get all the benefits of the AMPL environment, without having to first obtain and install each solver you want to try.

The introduction below covers everything you need to know to start using Kestrel with AMPL. Information about more advanced features and other uses of Kestrel can be found in Kestrel: An Interface from Optimization Modeling Systems to the NEOS Server.

Downloading and installing the Kestrel client

To get started using this feature, you must obtain the free Kestrel program. Begin by downloading one of the following compressed bundles of files:

Unpack the bundle into the same folder or directory that contains the AMPL executable. The Kestrel client requires Python on Unix-like systems, such as Linux and OS X, where it is usually preinstalled. If your system doesn’t have Python, install it by going to Python Setup and Usage and following the directions for your platform there.

Selecting and running a solver with the Kestrel client

The Kestrel client program works like a special kind of solver. Once you have installed the Kestrel executable, you tell AMPL to use Kestrel by setting the option solver to kestrel. You then set the option kestrel_options to indicate which remote solver you want to access through Kestrel. Directives for the chosen solver are specified in the usual way through an option having a name of the form solvername_options.

Additionally, NEOS now requires an email address with every submission. Supply your address to Kestrel by setting the AMPL option email.

As an example, here is how you might invoke Kestrel from a local AMPL session, using LOQO as your remote solver:

ampl: model steelT.mod;
ampl: data steelT.dat;
ampl: option solver kestrel;
ampl: option kestrel_options 'solver=loqo';
ampl: option loqo_options 'minlocfil sigfig=8 outlev=2';
ampl: option email '';
ampl: solve;
Connecting to:
Job 10219877 submitted to NEOS, password='MipoAHfd'
Check the following URL for progress report:
Job 10219877 dispatched
password: MipoAHfd
---------- Begin Solver Output -----------
Condor submit: 'neos.submit'
Condor submit: 'watchdog.submit'
Job submitted to NEOS HTCondor pool.
LOQO 7.00: optimal solution (15 QP iterations, 15 evaluations)
primal objective 515032.9977724714
dual objective 515033.0024976235
ampl:

That’s it! You are returned to the ampl: prompt and can proceed to display or manipulate the solution or to do anything else that you would have done if the problem had been solved by a local copy of LOQO. Subsequent uses of the solve command will continue to invoke a remote solver, until you change the option solver or conclude the AMPL session.

In general, you choose a remote solver through your setting of the AMPL option kestrel_options. The form of the AMPL command for this purpose is

option kestrel_options 'solver=solvername';

The following choices for solvername are currently recognized:

A current list can be obtained by specifying

option kestrel_options 'solver';

You can pass directives to any of these solvers by assigning an appropriate directive string to the AMPL option consisting of the solver’s name followed by _options. In the previous example, we requested LOQO’s “min local fill” option, specified agreement of primal and dual objectives to 8 significant figures, and turned on detailed iteration output by giving the following AMPL command prior to solving:

option loqo_options 'minlocfil sigfig=8 outlev=2';

For information on directives recognized by the available solvers, follow the links from their names in the listing above.

Viewing output and results

To look at the solver’s output while it is running, point your browser at the URL given in the Kestrel output as shown above, and click on “View Intermediate Results” in the web page that appears. This will take you to another page that shows all of the output produced by the solver for your problem so far. To track the solver’s progress, simply update this page periodically.

To retrieve results from a previous Kestrel run, first set up the same AMPL model and data that you used when submitting your problem. Then set kestrel_options to specify the job number and password that Kestrel reported when it processed the job. For the example above, the appropriate AMPL commands would be as follows:

ampl: model steelT.mod;
ampl: data steelT.dat;
ampl: option solver kestrel;
ampl: option kestrel_options 'job=10219877 password=MipoAHfd';
ampl: solve;

Following the solve, Kestrel contacts the NEOS Server to retrieve the results from the specified job. The display of the solver output and the return of the results to AMPL then proceed exactly as previously described. This feature enables you to retrieve results for any problem that has been successfully sent from Kestrel to the NEOS Server, even if Kestrel and AMPL have been terminated before results can be received. The NEOS Server only keeps previous results for a short time, however — typically a day — so this feature is not appropriate for archiving of runs.

Managing Kestrel runs

To submit multiple optimization runs and then retrieve them in the order that they were submitted, use the AMPL scripts kestrelsub and kestrelret that are unpacked along with the Kestrel executable. Detailed instructions are given in Kestrel: An Interface from Optimization Modeling Systems to the NEOS Server. This feature should be used with care, so as not to overload the Server resources.
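Based on the scripts’ described purpose, a submit-then-retrieve session might look like the following sketch; the model files and solver choice are carried over from the earlier example, and the exact behavior of the scripts is described in the paper cited above:

```ampl
ampl: model steelT.mod;
ampl: data steelT.dat;
ampl: option solver kestrel;
ampl: option kestrel_options 'solver=loqo';
ampl: commands kestrelsub;    # submit the job without waiting for results
ampl: commands kestrelret;    # later: retrieve the earliest submitted results
```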

To cancel an optimization run before the solver has completed execution, set option kestrel_options to specify the appropriate job number and password, and then run the AMPL script kestrelkill. For example:

ampl: option kestrel_options 'job=6893 password=FaahsrIh';
ampl: commands kestrelkill;

To ensure that AMPL will find the script, place the file kestrelkill in the directory (or folder) that will be current when you execute AMPL, or set option ampl_include to specify the directory where the script can be found.
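For example, if the Kestrel files were unpacked into a directory such as /opt/kestrel (a hypothetical path), the following commands would let AMPL find the script:

```ampl
ampl: option ampl_include '/opt/kestrel';
ampl: commands kestrelkill;
```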