Fortify Static Code Analyzer (SCA) is a Static Application Security Testing (SAST) tool. It can be used to identify security issues early in the development cycle, enabling developers to resolve findings without waiting until the end. This shifting left of security analysis both speeds up and makes more secure the implementation of new functionality and features.

When integrated into a CI/CD pipeline, Fortify SCA scanning occurs on every commit, and, if a branching strategy is used, any issues can be resolved prior to merging back into the main branch.

Normally, results from a Fortify SCA scan are submitted to Fortify Software Security Center (SSC), which acts as a central location to interpret and display the scan results. There is an officially supported Fortify Jenkins plugin that requires Fortify SCA to be installed alongside Jenkins and leverages SSC for result reporting.

In this two-part blog series, I will describe an alternative approach: leveraging SonarQube’s Generic Issue Data functionality to display the Fortify scan results alongside the other static analysis and test results. Doing this both centralizes all code quality reports (SSC is not used) and eliminates the need for Fortify SCA to be installed alongside Jenkins. Instead, it runs in its own Docker container.

In this blog, I will describe the Jenkins pipeline and how to generate the Fortify SCA scan results. In the next blog, I will describe how to convert the scan results and submit them to SonarQube for display and aggregation.

The reference code for this blog series is available here: https://github.com/justin-coveros/fortify-sonar-translate

A Few Assumptions

There are a few assumptions that will be made to simplify the example presented below. Because Fortify SCA is a licensed product, the first is that you have a valid license and the Fortify SCA installation package. The reference code uses version 17.10, but other versions can be used. The second is that you are scanning a Maven-based Java application. This only matters because, as you will see, the custom-built Docker image leverages the Fortify Maven plugin to perform the scan. This is certainly not a requirement for the translation from Fortify results to SonarQube generic data, but is the means by which the Fortify results are initially generated.

The third assumption is that you have Jenkins installed and capable of using the docker global variable to launch Docker containers as part of the pipeline execution. Further, you have SonarQube 7 or higher installed, and Jenkins is configured with the necessary authorization to create SonarQube project data. This initial setup is beyond the scope of this blog, and is only necessary if attempting to implement the described example. The code examples are used to explain the flow of the process, and are not guaranteed to work in your environment without at least minimal modifications.

Finally, not every Jenkins plugin used in this blog will be defined or specifically mentioned, particularly if it is not required or relevant to the intended focus. Pipeline steps such as readMavenPom() are made available by optional Jenkins plugins not described here.

Jenkins Pipeline

At a high level, the Jenkins pipeline that we will be using is the following:

node {
  stage('Checkout Code') {
    checkout scm
  }
  stage('Fortify Scan') {
    fortifyScan()
  }
  stage('Translate Results') {
    translateResults()
  }
  stage('Sonarqube Analysis') {
    sonarqubeScan()
  }
}

The pipeline is organized into four stages, with the specific implementation details separated into distinct functions.

First, the source code under test is checked out into the Jenkins build workspace. For this example, the specific details of the code repository (URL, authentication, branch, etc.) are pulled from the Jenkins job configuration, which is not detailed here. These job configuration details are used by the checkout scm step.

Second, Fortify SCA scans the source code, generating an FPR and CSV report. These files are used as input for the next stage, which converts the CSV file into a JSON format required by SonarQube. The last stage submits the Fortify SCA results alongside the other SonarQube scan results.

Fortify Scan Stage

Building the Image

As stated in the overview, one of the assumptions for this example is that we are testing a Java 8 Maven application. Therefore, the base image we will start with is maven:3.6.2-jdk-8. From this base, Fortify SCA will be installed and configured, and the resulting image will be used in the Jenkins pipeline to perform the Fortify SCA scan. This example uses Fortify SCA 17.10, but other versions may be used.

FROM maven:3.6.2-jdk-8


ENV MAVEN_HOME=/usr/share/maven
ENV JAVA_HOME=/usr/local/jdk-8

# Allow the UID/GID to be overridden at build time to match the Jenkins node user
ARG FORTIFY_ID=1000

RUN groupadd -g ${FORTIFY_ID} fortify \
  && useradd -u ${FORTIFY_ID} -g fortify -d /home/fortify fortify

WORKDIR /home/fortify/tmp

COPY fortify.license ./
COPY HPE_Security_Fortify_SCA_and_Apps_17.10_linux_x64.run* ./

RUN chown -R fortify:fortify /home/fortify \
  && chown -R fortify:fortify /usr/share/maven/ref/ \
  && chmod u+x HPE_Security_Fortify_SCA_and_Apps_17.10_linux_x64.run

USER fortify
ENV HOME /home/fortify

RUN ./HPE_Security_Fortify_SCA_and_Apps_17.10_linux_x64.run --mode unattended \
  && tar -xzf /home/fortify/HPE_Security/Fortify_SCA_and_Apps_17.10/plugins/maven/maven-plugin-bin.tar.gz -C ./

ENV PATH "/home/fortify/HPE_Security/Fortify_SCA_and_Apps_17.10/bin:${PATH}"
ENV MAVEN_CONFIG /home/fortify/.m2

RUN chmod u+x install.sh \
  && ./install.sh \
  && fortifyupdate


RUN rm -rf /home/fortify/tmp

Essentially, a fortify user is created and assigned the same UID and GID that Jenkins will use to execute commands with the docker plugin (the Jenkins node user). Beyond it being best practice to run containers as a non-root user, creating a container user with the same UID and GID that Jenkins will use avoids potential permission issues with command execution and persisted files.
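To find the UID and GID to bake into the image, you can run id as the user the Jenkins agent executes builds as (either directly on the node or from an sh step in a pipeline); the 1000:1000 values used throughout this example are only illustrative:

```shell
# Print the numeric UID and GID of the current user. Run this as the
# Jenkins node user and pass the result into the image build if it
# differs from the default of 1000.
id -u
id -g
```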

The license and installation script are then copied into the container and — as the fortify user — Fortify is installed and the maven plugin extracted into the user’s home directory. The fortify binaries are added to the fortify user’s path, the rule set updated, and then the temporary files removed.

The image, tagged as maven-fortify:1.0, is now ready to be used as part of a Jenkins pipeline. Other optional steps, not shown here, include adding custom rules or Maven settings as part of the build process. Modify this Dockerfile as your needs dictate.
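Assuming the Dockerfile, fortify.license, and the Fortify installer all sit in the same directory, building and tagging the image is a standard docker build; this is a sketch of the command:

```shell
# Build the scan image, tagging it with the name the pipeline expects.
docker build -t maven-fortify:1.0 .
```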

Using the Image in the Pipeline

In the Jenkins pipeline, the Fortify Scan stage executes the function described below.

def fortifyScan() {
  def pom = readMavenPom()
  def plugin = "com.hpe.security.fortify.maven.plugin:sca-maven-plugin:17.10"
  String filename = "${pom.getArtifactId()}-${pom.getVersion()}"
  docker.image('maven-fortify:1.0').inside() {
    sh "mvn -B ${plugin}:clean -Duser.home=/home/jenkins"
    sh "mvn -B ${plugin}:translate -DskipTests -Duser.home=/home/jenkins"
    sh "mvn -B ${plugin}:scan -Dfortify.sca.Xmx=8G -Duser.home=/home/jenkins"
    sh "FPRUtility -information -listIssues -project target/fortify/${filename}.fpr -outputFormat CSV -f target/fortify/fprcsv.csv"
  }
}

To understand how Jenkins executes this function, there are a couple of things to consider when using docker.image().inside(). First, Jenkins will mount the workspace directory into the container and set it as the working directory for the commands defined within the inside closure, so the source code checked out in the prior stage will be accessible to commands run in this stage.

Second, Jenkins will execute the commands as the same user running on the node. For this example, the Jenkins agent is running in a Docker container as the user 1000:1000. When Jenkins runs the commands defined within the inside() {} closure, it will use docker exec -u 1000:1000 .... Because we created a user in the maven-fortify:1.0 image with that same UID and GID, and made this the owner of the Fortify binaries and Maven plugins, everything will work as expected.

Third, the sh steps enclosed by the closure will be executed in the container, relative to the mapped workspace directory. If the Jenkins node executing the pipeline is itself a running Docker container, Jenkins will pass its mounted volumes to the started container. This enables generated artifacts to be persisted back to the host. The FPR file created in this stage by the maven-fortify:1.0 container will be accessible in the next stage by the fortify-to-sonarqube:1.0 container.

Using the Fortify maven plugin, the clean, translate, and scan Maven goals are run to generate the FPR file, which is converted to a CSV file by FPRUtility. Both the FPR and CSV files are persisted in the Jenkins job workspace because of the automatic volume mapping described above. The CSV file — target/fortify/fprcsv.csv — will be the input for the next stage.

Next Steps

With the Fortify files persisted in the Jenkins workspace, additional steps could include archiving these artifacts, emailing them as attachments, or uploading them into an external repository such as Nexus or Artifactory.
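For instance, archiving could be a one-line addition to the pipeline. This snippet is a sketch using the standard archiveArtifacts step, and the patterns assume the target/fortify output paths from the scan stage:

```groovy
// Attach the Fortify outputs to the Jenkins build record so they can
// be downloaded from the job page after the build completes.
archiveArtifacts artifacts: 'target/fortify/*.fpr, target/fortify/*.csv',
                 allowEmptyArchive: true
```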

The second part doesn’t focus on these optional paths. Instead, it describes the translateResults and sonarqubeScan pipeline steps, detailing how to display the scan results in SonarQube.

Part Two: https://www.coveros.com/fortify-to-sonarqube-part-two/
