The Cadence Analog Design Environment (ADE 5.1) supports distributed processing
(DP) using either LBS (Load Balancing System), a proprietary Cadence
system based on rsh, or LSF (previously from Platform Computing, now IBM).
To use Altair Accelerator with ADE 5.1:
Make ADE believe that it is using LBS
Use the "Job Submit Form" to specify the command line to dispatch the jobs
to Altair Accelerator
Set Up a Minimal LBS Queue Using Accelerator
The following steps describe how to set up a minimal LBS queue and tell ADE to use
Altair Accelerator instead.
Pick a machine to use as the LBS master (for example, 'host1'). You should be able
to rsh to this machine without entering a password.
Add the following line to the environment used to run the Cadence tools.
# Add this to the environment definition file
# $VOVDIR/local/environments/CADENCE.start.tcl
# (An example is available in
# $VOVDIR/eda/Cadence/environments/CADENCE.start.tcl)
setenv LBS_CLUSTER_MASTER host1
In the above example, the CADENCE.start.tcl file is the
one that defines the VOV environment named 'CADENCE'. By defining
LBS_CLUSTER_MASTER in the environment script, the variable is set for all jobs
running in that environment. You can verify the setup as follows:
% ves CADENCE,5.1
% printenv | grep LBS
% which icfb
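If the environment is loaded correctly, the output looks similar to the following;
the icfb path is a hypothetical example and depends on your installation:
% printenv | grep LBS
LBS_CLUSTER_MASTER=host1
% which icfb
/tools/cadence/IC5141/tools/bin/icfb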
Create a minimal configuration file for LBS and call this file
lbsconfig.
The file declares a single queue called 'samplequeue' with one host, named
'cheetah', that has capacity 1.
Remember that we are only trying to persuade ICFB that it is using LBS, when in
fact the jobs are going to be run by Altair Accelerator.
Example of a minimal LBS configuration file, in this case named
lbsconfig:
samplequeue 1
cheetah 1
In the above
example, the file configures a queue named 'samplequeue' with one
execution host, named 'cheetah', that has one job slot. This queue is not
actually used, but it is required to enable the distributed option in
ADE.
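For reference only, and assuming the format described above (the queue name
followed by its host count, then one 'hostname capacity' line per host), a
placeholder queue with two hypothetical hosts would look like this; it is not
needed for this procedure:
samplequeue 2
host1 1
host2 1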
Launch Cadence LBS on the $LBS_CLUSTER_MASTER machine with:
% cdsqmgr $cwd/lbsconfig >& lbs.log &
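To confirm that the queue manager started, a generic check such as the following
can help; these are standard UNIX commands, assuming the csh-style shell used in
the examples above:
% ps -ef | grep cdsqmgr
% tail lbs.log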
Optional: You can also check the setup with:
% cdsDPSetupChk -shell csh -mode n
Note: If you are using ssh instead of rsh, you may need to modify the
cdsDPrsh script. If you are using rsh, you can skip this step.
Update your ~/.cdsinit file. The simplest change is to add
the following lines:
Launch ICFB in an environment that contains both icfb and spectre:
% ves CADENCE,5.1+MMSIM
% which icfb
% which spectre
% icfb &
In the Simulation window, select Setup > Simulation/Directory/Host and change Host Mode to distributed.
Click Check setup to make sure ADE agrees that everything is OK.
Click Simulation > Netlist and then click Run, or click Simulation > Run, to display the Job Submit form.
Select the field labeled command and enter the Altair Accelerator
submission command.
This command is prefixed to all the simulation commands and submits
the jobs to the Altair Accelerator farm. You will typically use something
like:
nc run -D -C cadence_ade -J ADE_@JOB@
where
cadence_ade is the name of a job class that defines the
submission parameters for Spectre jobs.
In the command field, you can use the symbolic values @JOB@ and @USER@.
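As an illustration only, a job class such as cadence_ade is normally defined by a
small Tcl file on the Accelerator side, for example under
$VOVDIR/local/jobclass/cadence_ade.tcl. The sketch below assumes the
VOV_JOB_DESC array convention and uses placeholder resource names; check the
Accelerator job class documentation for the exact syntax of your version.
# Hypothetical sketch of $VOVDIR/local/jobclass/cadence_ade.tcl
# Resource names and values are placeholders.
set classDescription        "Cadence ADE / Spectre simulations"
set VOV_JOB_DESC(resources) "linux License:spectre"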
Make Sure Spectre Is Run in a Preemptable Mode
For spectre to be preemptable, you need to run it with the
+lsusp option, so that a SIGTSTP signal can be sent to the
spectre process to make it release its licenses. You can define this option in the
environment options form.
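As a quick manual check, using standard UNIX signals and a hypothetical process
ID, you can suspend a spectre process that was started with +lsusp and then
resume it; while suspended it should release its licenses:
% kill -TSTP 12345
% kill -CONT 12345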
If the run directory is not recognized as shared and reachable from the
vovtasker hosts, you can fix this by configuring Altair Accelerator's
equiv.tcl file, or work around it by using the
-D option in the prefix you enter in the Job Submit form,
for example:
nc run -D -C spectre
The -D switch asserts that the run directory is the same and
reachable from any vovtasker host. For documentation, see
Submitting
Jobs.
For information on VOV Logical Filesystem Names, configured in the
equiv.tcl file, please refer to
Equivalence
File. Usually the Altair Accelerator administrator will
need to be involved to modify this configuration file.
To stop the simulation, use the Accelerator GUI or the nc stop command.
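For example, assuming that nc stop accepts the IDs of the jobs to stop (the ID
below is hypothetical; use the GUI or nc list to find the actual IDs of your ADE
jobs):
% nc stop 01234567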