
Re: Option Change ?



On Wed, Jun 25, 2003 at 05:33:58PM -0400, Fu, Chen wrote:
> It seems to me that there are several changes to the command-line options in
> the new version.

Yes, some of the options are slightly different as of 2.0. The new
options are documented in the Soot tutorials, which are distributed with
Soot in the tutorial directory and are also available on-line at
http://www.sable.mcgill.ca/soot/tutorial/
Running Soot with the -h switch also gives very brief but up-to-date
documentation of all the options.
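
For example, assuming soot is on your classpath:

    java soot.Main -h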

> -a is no longer valid. Is it replaced by -w?  And which phases are
> automatically enabled by -w?

Roughly speaking, yes. It was never clearly defined what -a was supposed
to do, and it was interpreted differently by different phases. The -w
switch is not exactly equivalent to what -a did, but in cases where you
used -a before, you will most likely want to use -w now.

-w tells Soot that you want it to do whole-program analyses.
It enables the packs containing whole-program analyses.
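
For example, a whole-program run that explicitly turns on Spark in the
call-graph pack might look something like this (MyMainClass is just a
placeholder for your own class):

    java soot.Main -w -p cg.spark enabled:true MyMainClass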

> And about phase options.
> 
> I wrote a SceneTransformer named "ExceptionSolver", and I register it in my
> own main function using the code below and then call soot.Main
> 
> PackManager.v().getPack("wjtp").add(new Transform("wjtp.exception",
> ExceptionSolver.v()));
> 
> 
> And inside my own phase, I need to call spark, so I wrote:
>     SparkTransformer.v().transform( phaseName + ".spk" );
> 
> I was able to use "-p wjtp.exception.spk  XXXXX" to pass some options to
> the Spark instance I'm using in Soot 1.2.5.  But now Soot complains that
> this phase does not exist.

One of the problems with earlier versions of Soot was that there was no
way to detect errors in the phase options, so Soot silently ignored
command lines with misspelled phase names, causing lots of frustration.

As of Soot 2.0, only real, registered transforms can accept phase
options.
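
For example, once your transform is registered under the name
wjtp.exception (as in the code you quoted above), a phase option for it,
such as enabled:true, is accepted:

    java soot.Main -w -p wjtp.exception enabled:true MyMainClass

(MyMainClass is just a placeholder.) Passing options to a name that was
never registered as a phase, such as wjtp.exception.spk, now produces an
error instead of being silently dropped.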

> Would you please help me figure out how to get around this?  I'm calling
> Spark inside my own phase because I need to do something before and after
> the points-to analysis.

In this case, I think the cleanest solution is to make your modified
version of Spark (which needs to do something before and after) a real,
registered transform. To do this, subclass SparkTransformer, and
implement the internalTransform method like this:

    protected void internalTransform( String phaseName, Map options )
    {
        // stuff to do before Spark

        // run Spark
        super.internalTransform( phaseName, options );

        // stuff to do after Spark
    }

Then add this transformer to the cg pack. 

Assuming you want this modified transformer to take command-line options
(presumably the same ones as Spark), you will have to call
setDeclaredOptions() and setDefaultOptions() on the Transform that you
create. You can get the Declared and Default options for the standard Spark
by calling Options.v().getDeclaredOptionsForPhase("cg.spark") and
Options.v().getDefaultOptionsForPhase("cg.spark"). So your main function
would have something like this:

Transform mySpark = new Transform("cg.myspark", MySparkTransformer.v());
mySpark.setDeclaredOptions(
             Options.v().getDeclaredOptionsForPhase("cg.spark"));
mySpark.setDefaultOptions(
             Options.v().getDefaultOptionsForPhase("cg.spark"));
PackManager.v().getPack("cg").add(mySpark);
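
Putting it all together, your own main could then look roughly like the
sketch below. MyMain and MySparkTransformer are placeholder names,
ExceptionSolver is your own class, and the last line simply hands the
remaining command-line arguments to Soot's normal entry point:

    import soot.PackManager;
    import soot.Transform;
    import soot.options.Options;

    public class MyMain {
        public static void main(String[] args) {
            // Register the modified Spark in the call-graph pack.
            Transform mySpark =
                new Transform("cg.myspark", MySparkTransformer.v());
            mySpark.setDeclaredOptions(
                Options.v().getDeclaredOptionsForPhase("cg.spark"));
            mySpark.setDefaultOptions(
                Options.v().getDefaultOptionsForPhase("cg.spark"));
            PackManager.v().getPack("cg").add(mySpark);

            // Register the exception analysis in the whole-jimple pack,
            // as in your existing code.
            PackManager.v().getPack("wjtp").add(
                new Transform("wjtp.exception", ExceptionSolver.v()));

            // Let Soot parse the rest of the options and run as usual.
            soot.Main.main(args);
        }
    }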

Whether or not you decide to merge your ExceptionSolver with your
MySparkTransformer is up to you. Personally, I would keep them separate,
especially if ExceptionSolver can also use call graphs generated by
something other than Spark (such as CHA).

Ondrej