Welcome back to another article dedicated to the DevOps world!
Recently, I found myself needing to configure a DevOps environment from scratch, and among the various tasks was integrating SonarQube with GitLab.
SonarQube is a popular static analysis tool that inspects source code for bugs, security issues (including the OWASP Top 10 categories), code smells, and more.
In my pipeline design, analysis with this tool is a mandatory step, both to maintain high code quality and to detect any issues introduced with recent commits. Let’s see how to configure everything.
Requirements
- Pre-installed instances – I assume you already have GitLab and SonarQube instances installed. The screenshots in this article refer to a self-hosted GitLab and SonarQube Community Edition (running with Docker).
- Active GitLab runners – I covered this in this tutorial. Runners are the processes that execute your pipelines. On a self-hosted instance you’ll need to configure them; on GitLab.com, shared runners are already available.
Objective
Execute SonarQube analysis during GitLab CI/CD pipeline execution to detect potential issues.
Configuration steps
1. GitLab – Creating a technical user
I always recommend creating a technical user, mainly so that no individual employee is tied to a business function (what happens if they leave the company and your pipelines stop working?).
The first step is to create a new user with an email shared among multiple team members. Log in to GitLab with an administrator account and go to Admin → Users → New user. Let’s call it “sonarqube_user” and leave the access level as “Regular”.

Now that the user is created, log in with its credentials and click “Access tokens” in the left menu. We need to create an access token to use in SonarQube. Click “Add new token”.
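Before moving on, you can sanity-check the token against the GitLab API: `GET /user` returns the account the token belongs to. A small sketch, assuming a placeholder hostname:

```shell
# Placeholder host; use your own GitLab address
GITLAB_URL="https://gitlab.yourdomain.com"
API_USER_ENDPOINT="${GITLAB_URL}/api/v4/user"
echo "$API_USER_ENDPOINT"

# With the real token, this returns the sonarqube_user account as JSON (HTTP 200):
#   curl --header "PRIVATE-TOKEN: <access-token>" "$API_USER_ENDPOINT"
```

If the call returns 401, the token is wrong or expired; fix it now rather than debugging it later inside SonarQube.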

2. SonarQube – GitLab integration
Let’s move to SonarQube to integrate the two tools using the user token created earlier. Click “Administration” in the top menu.

Then click DevOps Platform Integrations → GitLab.

Click “Create Configuration”, then enter a name, your GitLab API URL (e.g., https://gitlab.yourdomain.com/api/v4), and the user token created earlier. Save, then click the “Check Configuration” button to verify that the integration works.

3. SonarQube – Project import
At this point, the two environments are connected, but we still need a few more steps to complete the setup. In SonarQube, each project must have its own configuration. From the top menu, click “Projects” to return to the home page, then Create project → From GitLab.

On the page that appears, we’ll see a list of all the projects on GitLab; select one to import it.

In the next step, we’ll be asked what to consider “new code”: the code produced with each commit, everything changed in the last N days, code since the previous version, etc. The choice is yours; if in doubt, select “Use the global setting”, which inherits whatever default your instance administrator has defined.
On the next page, we’ll be asked which tool we use to run our pipeline, and SonarQube will help us configure the environment. We’ll be provided with a token and a project key: the first authenticates the scanner with SonarQube, the second links the analysis to the correct project.
BE CAREFUL
There are some special cases where it’s better to select the “Locally” option instead of “With GitLab CI”. With this option, only the project token is generated, without any environment details. I had to follow this path because my DevOps machine has an ARM architecture and the Docker analysis image doesn’t support it (at the moment). To perform the analysis, I simply executed the scanner command via Node in the pipeline.
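For reference, the “Locally” variant boils down to installing the scanner from npm and invoking it directly in a job. A minimal sketch (the job name and runner tag are hypothetical; the variables are the ones created in the next step):

```yaml
sonar-local-analysis:
  tags:
    - build_angular   # any runner with Node.js available, ARM included
  script:
    # The npm-based scanner runs via Node, so no x86-only Docker image is needed
    - npm install -g sonarqube-scanner@latest
    - sonar-scanner -Dsonar.host.url="${SONAR_HOST_URL}" -Dsonar.token="${SONAR_TOKEN}" -Dsonar.projectKey="${SONAR_PROJECT_KEY}"
```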
4. GitLab – Creating environment variables
Whether your repository has one project or many, it’s good practice to create variables for use in your pipelines, separating configuration values from pipelines.
The first environment variable we’ll configure will contain our SonarQube address, and we’ll create it as an instance variable: any project will see it and can use it in its pipelines. All pipelines will refer to this value, and if the address changes in the future, you’ll only need to update the variable instead of all pipelines.
Go to Admin → Settings → CI/CD and expand the “Variables” section. Click the “Add variable” button.

Remove the “Protected” flag (so the value is available to pipelines on any branch) and leave visibility as “Visible”. Enter SONAR_HOST_URL as the key and the SonarQube address as the value, without a trailing slash.
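The trailing slash matters, because a value like `https://host/` can produce malformed API URLs when the scanner appends its own paths. A tiny shell sketch, with a hypothetical hostname, that normalizes the value before use:

```shell
# Hypothetical address; replace with your real SonarQube URL
SONAR_HOST_URL="https://sonarqube.yourdomain.com/"

# Strip a single trailing slash, if present, using shell parameter expansion
SONAR_HOST_URL="${SONAR_HOST_URL%/}"

echo "$SONAR_HOST_URL"
```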

Now we need to create another variable with the project token, but we’ll create it at the project level instead of the instance level. In GitLab, go to the project we just imported into SonarQube and select Settings → CI/CD from the side menu.

The screen is similar to the previous one (with a few more options). Expand “Variables” and add a new variable named SONAR_TOKEN containing the token generated by SonarQube, making sure to select “Masked” in the Visibility section.

Also add the third and final variable (again at project level), SONAR_PROJECT_KEY, containing the SonarQube project key.
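To recap, a job sees the three variables like any other environment variable. A minimal sketch (the job name is hypothetical; never echo the token itself):

```yaml
check-sonar-config:
  script:
    - 'echo "SonarQube server: $SONAR_HOST_URL"'   # instance-level variable
    - 'echo "Project key: $SONAR_PROJECT_KEY"'     # project-level variable
    - 'test -n "$SONAR_TOKEN"'                     # masked project-level token; only check it exists
```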
The environment configuration is complete—we can now add an analysis step to our pipeline!
5. GitLab – Pipeline configuration
Below are example pipelines that execute SonarQube analysis. If you don’t have a pipeline, you can use this as a starting point; if you already have one, you can take inspiration to improve it.
.NET Core
```yaml
# Docker image
image: mcr.microsoft.com/dotnet/sdk:9.0

# Pipeline stages
stages:
  - sonar-analysis
  - build

# Global variables
variables:
  DOTNET_SKIP_FIRST_TIME_EXPERIENCE: "true"
  DOTNET_CLI_TELEMETRY_OPTOUT: "true"
  PROJECT_PATH: "<your-project-name>" # --> Name of your project
  # Defines the location of the analysis task cache
  SONAR_USER_HOME: "${CI_PROJECT_DIR}/.sonar"
  # Disable shallow clone: SonarQube needs the full history for accurate analysis
  GIT_DEPTH: "0"

# SonarQube
sonar-analysis:
  stage: sonar-analysis
  tags:
    - build_dotnet
  cache:
    policy: pull-push
    key: "sonar-cache-$CI_COMMIT_REF_SLUG"
    paths:
      - "${SONAR_USER_HOME}/cache"
      - sonar-scanner/
  script:
    - "dotnet tool install --global dotnet-sonarscanner"
    - "export PATH=\"$PATH:$HOME/.dotnet/tools\""
    - "cd $PROJECT_PATH"
    - "dotnet sonarscanner begin /k:\"${SONAR_PROJECT_KEY}\" /d:sonar.token=\"$SONAR_TOKEN\" /d:\"sonar.host.url=$SONAR_HOST_URL\""
    - "dotnet build --configuration Release"
    - "dotnet sonarscanner end /d:sonar.token=\"$SONAR_TOKEN\""
  allow_failure: false

build:
  stage: build
  tags:
    - build_dotnet
  script:
    - echo "Start building.."
    - dotnet --version
    - cd $PROJECT_PATH
    - dotnet restore
    - dotnet build --configuration Release --no-restore
    - echo ".. building completed!"
  artifacts:
    paths:
      - $PROJECT_PATH/bin/Release/
    expire_in: 1 hour
```
Angular
For Angular projects, you’ll need to add a file called “sonar-project.properties” to the repository root with this content:

```properties
sonar.projectName=Your project name
sonar.inclusions=**/*.ts,**/*.js,**/*.html,**/*.css,**/*.scss
sonar.test.inclusions=**/*.spec.ts,**/*.test.ts,**/*.e2e-spec.ts
sonar.exclusions=**/node_modules/**,**/dist/**,**/coverage/**,**/*.spec.ts,**/*.test.ts,**/*.e2e-spec.ts
```

Every time SonarQube performs an analysis, it looks for a file with this name and reads these parameters as configuration.
ATTENTION: GitLab variables are not substituted in this file; values that depend on the environment (such as the project key) must be passed on the scanner command line instead.
The pipeline will be slightly more complex than the .NET one. In the first phase, we’ll download packages and dependencies; in the second, we’ll perform the analysis; and in the third, we’ll build the project.
```yaml
stages:
  - setup
  - analysis
  - build

##############################################################
# Cache to improve performance
##############################################################
.node_cache:
  cache:
    - key:
        files:
          - package-lock.json
        prefix: "node-modules-${CI_COMMIT_REF_SLUG}"
      paths:
        - node_modules/
        - .npm/
      policy: pull-push
    - key:
        files:
          - package-lock.json
        prefix: "node-modules-main"
      paths:
        - node_modules/
        - .npm/
      policy: pull
    - key: "${CACHE_FALLBACK_KEY}"
      paths:
        - node_modules/
        - .npm/
      policy: pull

##############################################################
# Init repo dependencies
##############################################################
install-dependencies:
  stage: setup
  tags:
    - build_angular
  extends: .node_cache
  script:
    - npm ci --cache .npm --prefer-offline
    - echo "Setup dependencies completed"
    - npm list --depth=0 2>/dev/null || true
  cache:
    - key:
        files:
          - package-lock.json
        prefix: "node-modules-${CI_COMMIT_REF_SLUG}"
      paths:
        - node_modules/
        - .npm/
      policy: push
  artifacts:
    paths:
      - node_modules/
    expire_in: 1 hour
  allow_failure: false

##############################################################
# Sonar scan - works on your machine or an ARM server alike
##############################################################
sonarqube-analysis:
  stage: analysis
  tags:
    - build_angular
  extends: .node_cache
  script:
    # Check configuration file and required variables
    - |
      if [ ! -f "sonar-project.properties" ]; then
        echo "sonar-project.properties missing"
        exit 1
      fi
      if [ -z "$SONAR_HOST_URL" ] || [ -z "$SONAR_TOKEN" ] || [ -z "$SONAR_PROJECT_KEY" ]; then
        echo "SONAR_HOST_URL, SONAR_TOKEN or SONAR_PROJECT_KEY missing - skipping analysis"
        exit 0
      fi
    # Install and execute the SonarQube scanner
    - npm install -g sonarqube-scanner@latest --no-audit --no-fund
    - |
      sonar-scanner \
        -Dsonar.host.url="${SONAR_HOST_URL}" \
        -Dsonar.token="${SONAR_TOKEN}" \
        -Dsonar.projectKey="${SONAR_PROJECT_KEY}"
    - echo "Sonar scan completed"
  dependencies:
    - install-dependencies
  allow_failure: true

##############################################################
# Build project
##############################################################
build-project:
  stage: build
  tags:
    - build_angular
  extends: .node_cache
  variables:
    BUILD_CONFIGURATION: "production"
  script:
    - echo "Building Angular project..."
    - |
      npm run build -- \
        --configuration=${BUILD_CONFIGURATION} \
        --optimization=true \
        --aot=true \
        --stats-json=true
      echo "Build completed!"
  dependencies:
    - install-dependencies
  needs:
    - job: install-dependencies
      optional: false
```

Conclusion
In this article, we’ve seen how to integrate SonarQube into a GitLab pipeline. Automating the monitoring of our projects is both fundamental and strategic. Not only do these tools provide alerts about potential bugs or security issues (it would be worse to discover them when the software is in production), but they also provide advice on how to improve what we’ve written.
Whether we like it or not, IT is constantly evolving, and as developers we need to keep pace to stay prepared and avoid accumulating “technical debt” in our applications. I don’t expect everyone to share this viewpoint; in my experience, many people avoid staying up to date simply out of convenience. However, sometimes it takes very little: if each of us learned even one small thing every day, we would become better developers every day.
I hope this article has added something to your technical toolkit. See you next time!