FMICertification

There is no FMI Certification, but there is FMI Compatibility Checking.

https://www.fmi-standard.org/tools is a list of FMI tools that shows compatibility. The process to add a tool to the table includes adding files to a svn repo where an svn commit-hook runs some validation tests when the files are committed.

Documentation

Files

There are two types of checking:

  1. Checking for tools that export FMUs (We have some handwritten FMUS that could be checked in)
  2. Checking for tools that import FMUs (Ptolemy can import FMUs)

Files to be checked in for tools that export FMUs

In Section 5, Rule #9, the FMI Cross Check Rules describe what files are to be checked in for tools that export FMUs:

"To submit per exported FMU stored on the SVN server:"

  • {FMUName}.fmu: The FMU. If the FMU cannot be provided (e.g. because it contains critical intellectual property), submit a file {FMUName}.nofmu. The ReadMe file shall contain information about how to get access to that FMU directly from the exporting tool vendor.

  • {FMUName}_ref.csv: Reference solution as computed by the exporting tool. It is recommended to limit the file to at most 10 of the important variables.
  • {FMUName}_in.csv: optional input signals in case the FMU has inputs. If intermediate values are required for continuous signals, linear interpolation is to be applied.
  • A ReadMe.txt or ReadMe.pdf with (see Appendix C for an example ReadMe.txt), e.g.:
    • description of the FMU (features, intent, compile/link details,...)
    • email address where to contact exporting company in case of import problems
  • {FMUName}_cc.bat: A batch file to run the experiment with the ComplianceChecker for Windows platforms, {FMUName}_cc.sh for *nix platforms
  • {FMUName}_cc.log: The log of the ComplianceChecker with minimal log level 3 (warning): -l 3. This allows smaller logs in case of excessively large files produced with the default -l 4.
  • {FMUName}_cc.csv: Result data for selected signals from the tool for that simulation as .csv file
  • {FMUName}_ref.opt: Options used to create reference output and to guide comparing against, CSV format, required elements:
    • StartTime, 0.0 // in seconds
    • StopTime, 0.0 // in seconds
    • StepSize, 0.01 // in seconds, 0.0 means variable step solver
    • RelTol, 0.0001
    • optional elements:
      • AbsTol, 2
      • SolverType, FixedStep // see implementation notes for a list of predefined types
      • OutputIntervalLength // reference data provided with this time spacing in seconds
  • notCompliantWithLatestRules: If a submission does not comply with the latest submission rules, this file is present. To remove this file, vendors are allowed to adapt all files to comply with the newest rules (except {FMUName}.fmu).

"Observe the naming conventions given here, including case. We recommend keeping {FMUName} short to avoid path length restriction problems on platforms like Windows."
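Putting the required and optional elements together, a hypothetical BouncingBall_ref.opt might look like the following. The values are illustrative only (the StopTime and AbsTol here are made up; a real submission would use the values the reference solution was actually computed with):

```
StartTime, 0.0
StopTime, 3.0
StepSize, 0.01
RelTol, 0.0001
AbsTol, 0.0001
SolverType, FixedStep
```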

Rules for tools that import FMUs

In Section 5, Rule #10, the FMI Cross Check Rules describe what files are to be checked in for tools that import FMUs:

"The FMU importing tools must report on importing for all Reference FMUs available for the supported FMI Variant and supported platforms provided on the SVN server (Note: not required to “pass” because of potentially missing capabilities of the FMU) and must successfully import at least 3 FMUs for at least one supported platform of at least 3 exporting tools, and run for as long as the {FMUName}_ref.opt states and supply a .csv file of the solution they computed to receive an "Available" status."

Rule #11 states:

"To submit per imported FMU:"

  • "A ReadMe.txt or ReadMe.pdf with"
    • "a description of how to import and simulate each of the FMUs, if no test setup is provided"
    • "in case of failure to run: an analysis of the reasons. This file is only needed if either test setup or test failure have to be described."
  • "A test setup for the importing tool to simplify verification of the test run by anyone who licensed the importing tool. Ideally this uses some kind of automation provided by the importing tool."
  • "{FMUName}_out.csv: Computed results as CSV file (CSV file format see Appendix B) for the same variables as given in the reference CSV file"
  • "In order to classify the result as “passed”, the results should correspond to the reference solution."
  • Vendors are encouraged to produce a screen-shot of the results and the reference solution as displayed in the importing tool (for "important" signals) for simpler validation of their claim "passed".

"Vendors of tools that import FMUs for the "c-code" platform and would like to be listed in the Cross-Check Table have to organize one-to-one tests with exporting tool vendors in case they are not providing “c-code” FMUs publicly to produce testimonials for successful Cross-Check results."
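Since an importing tool must run each FMU for as long as its _ref.opt states, a test harness needs to read those options. Here is a minimal sketch of such a parser, assuming the "Name, Value" one-pair-per-line CSV layout shown in Rule #9; this is not code from any FMI tool:

```python
import csv
import io

def parse_ref_opt(text):
    """Parse a {FMUName}_ref.opt file: one 'Name, Value' pair per line."""
    options = {}
    for row in csv.reader(io.StringIO(text)):
        if len(row) >= 2:
            options[row[0].strip()] = row[1].strip()
    return options

# Illustrative option file contents (values are made up).
sample = """StartTime, 0.0
StopTime, 3.0
StepSize, 0.01
RelTol, 0.0001"""

opts = parse_ref_opt(sample)
# The importing tool must simulate over exactly this interval.
duration = float(opts["StopTime"]) - float(opts["StartTime"])
```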

Possible Plan for Ptolemy

There are various FMUs at https://svn.fmi-standard.org/fmi/branches/public/Test_FMUs/FMI_2.0

Focusing on FMI-2.0 for CoSimulation:

  • There are almost no darwin64 test FMUs.
    • We primarily develop under Mac OS X.
      • We could contact the FMUSDK developers and provide FMUSDK darwin64 test fmus.
  • FMUSDK has only win32, win64.
    • We are most familiar with FMUSDK and our nightly build runs under linux64.
      • We could contact the FMUSDK developers and provide FMUSDK linux64 test fmus.

In the near term, we could just use the FMUSDK FMUs that are in the ptII tree, and then submit them later.

https://svn.fmi-standard.org/fmi/branches/public/CrossCheck_Results/FMI_2.0/CoSimulation/ contains the results; we could mimic what is in https://svn.fmi-standard.org/fmi/branches/public/CrossCheck_Results/FMI_2.0/CoSimulation/win64/Dymola/2016FD01/FMUSDK/2.0.4/BouncingBall/

Note that FMUSDK 2.0.4 has been released. It would be good to update fmusdk 2, which supports Linux and Mac.

Note that the Ptolemy Continuous domain will probably not work with regular FMI-2.0 FMUs because FMI-2.0 FMUs do not support a zero step size. We have an extension that fixes this, but it means the cross-check models would need to be SDF.

We have models that test the FMUSDK fmus. These models would need to be adapted to write csv.
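For illustration, the cross-check result files use a simple layout: time in the first column, then the selected variables (see Appendix B of the rules for the authoritative format). The variable names and values below are made up, but they show the shape of the file a model would need to write:

```python
import csv

# Hypothetical simulation trace for a bouncing-ball FMU:
# time, height h, velocity v (values are illustrative).
rows = [
    ("time", "h", "v"),
    (0.0, 1.0, 0.0),
    (0.1, 0.951, -0.981),
    (0.2, 0.804, -1.962),
]

with open("bouncingBall20_out.csv", "w", newline="") as f:
    csv.writer(f).writerows(rows)
```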

For example, fmus in $PTII/ptolemy/actor/lib/fmi/fmus that end in 20 are FMI-2.0 CS:

  • bouncingBall20
  • dq20
  • inc20
  • values20
  • vanDerPol20

FMUS that end in ME20 are FMI-2.0 Model Exchange:

  • bouncingBallME20
  • dqME20
  • fourInputsFourOutputsME20
  • incME20
  • valuesME20
  • vanDerPolME20

Running make update in the fmus directory compiles these and updates actor/lib/fmi/test/auto.

actor/lib/fmi/test/auto contains tests like:

  • FMUDq20.xml
  • FMUSDFBouncingBall20.xml
  • FMUSDFInc20.xml
  • FMUValues20.xml
  • FMUVanDerPol20.xml

(There are similar tests for the FMI-2.0 ME fmus with names like ME20.xml)

So, for each of these tests, we need to connect a CSVWriter and then compare the results with the known good results at https://svn.fmi-standard.org/fmi/branches/public/Test_FMUs/FMI_2.0/CoSimulation/

However, before we get started, we should reorganize actor/lib/fmi/fmus so that it is hierarchical:

  • actor/lib/fmi/fmus/fmi10
    • actor/lib/fmi/fmus/fmi10/CoSimulation
    • actor/lib/fmi/fmus/fmi10/ModelExchange
  • actor/lib/fmi/fmus/fmi20
    • actor/lib/fmi/fmus/fmi20/CoSimulation
    • actor/lib/fmi/fmus/fmi20/ModelExchange

Don't use dots in directory names, so avoid fmi2.0; fmi-2-0 or FMI-2-0 would be ok.

Comparing CSV files

It is useful to be able to compare the csv output of the FMUCheck with the output of Ptolemy.
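A minimal sketch of such a comparison in Python, assuming both files share the same time grid and column order, with tolerances in the spirit of the RelTol/AbsTol entries in _ref.opt (this is a homegrown sketch, not one of the tools listed below):

```python
import csv

def csv_close(file_a, file_b, rel_tol=1e-4, abs_tol=1e-4):
    """Return True if all numeric cells in the two CSV files agree
    within the given relative/absolute tolerances.  Assumes both
    files have a header row and identical row/column layout."""
    with open(file_a) as fa, open(file_b) as fb:
        rows_a = list(csv.reader(fa))
        rows_b = list(csv.reader(fb))
    if len(rows_a) != len(rows_b):
        return False
    for row_a, row_b in zip(rows_a[1:], rows_b[1:]):  # skip header
        for cell_a, cell_b in zip(row_a, row_b):
            a, b = float(cell_a), float(cell_b)
            if abs(a - b) > abs_tol + rel_tol * max(abs(a), abs(b)):
                return False
    return True
```

A real comparison (like the tools below) would also handle differing time grids by interpolating, which this sketch does not attempt.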

csvcomparator

  • http://sourceforge.net/projects/csvcomparator/ - Java based comparator, see Compare CSV files (stackoverflow)
    • To run, use something like java -jar ~/src/CSVCompare/CSVComparator.jar -olderFile=CoupledClutches_cc_orig.csv -youngerFile=CoupledClutches_cc.csv.
    • Note that by default the output is created in C:/temp/CVSComparator/compare_result.csv, which fails on non-Windows platforms. There is a -resultFile= argument, which must refer to an absolute file path:
bash-3.2$ java -jar ~/src/CSVCompare/CSVComparator.jar -olderFile=CoupledClutches_cc_orig.csv -youngerFile=CoupledClutches_cc.csv -resultFile=foo.csv
[INFO] Start Comparing...
[INFO] Write Result...
Exception in thread "main" java.lang.NullPointerException
        at java.io.File.<init>(File.java:277)
        at com.csvcomparator.methods.CSVMethods.writeCSVData(CSVMethods.java:92)
        at com.csvcomparator.main.Main.main(Main.java:68)
bash-3.2$ java -jar ~/src/CSVCompare/CSVComparator.jar -olderFile=CoupledClutches_cc_orig.csv -youngerFile=CoupledClutches_cc.csv -resultFile=/tmp/foo.csv
[INFO] Start Comparing...
[INFO] Write Result...
[INFO] Done!!!
bash-3.2$

csv-compare (from Modelica)

bash-3.2$ cd /tmp
bash-3.2$ git clone https://github.com/modelica-tools/csv-compare.git
Cloning into 'csv-compare'...
remote: Counting objects: 393, done.
remote: Compressing objects: 100% (44/44), done.
remote: Total 393 (delta 17), reused 0 (delta 0), pack-reused 349
Receiving objects: 100% (393/393), 397.73 KiB | 762.00 KiB/s, done.
Resolving deltas: 100% (209/209), done.
Checking connectivity... done.
bash-3.2$ cd csv-compare
bash-3.2$ make
make: *** No rule to make target `Modelica_ResultCompare/Properties/AssemblyInfo.cs.template', needed by `Modelica_ResultCompare/Properties/AssemblyInfo.cs'.  Stop.
bash-3.2$

FMUCheck

The fmuCheck binary can be used to create a reference csv file that is then compared against the output of the tool.

To see what options fmuCheck takes, run it without command line arguments.

One option is quite useful: -f prints all variables to the output file; the default is to print only outputs.

Should Ptolemy be able to support this?

See Also

External Resources

Page last modified on February 02, 2016, at 10:23 PM