A JUnit XML reporter for Cypress that includes screenshots, videos, and logs (2023-09-27)
https://www.testspace.com/blog/cypress-xml-reporter-including-screenshots-videos-and-logs
<p>Cypress is a powerful and easy-to-use E2E testing framework that offers several advantages over other frameworks. Cypress also integrates with popular CI/CD tools, such as GitHub Actions, CircleCI, and Buildkite, which makes it easy to automate system-testing workflows.</p>
<p>Testspace has built-in integrations with <a href="https://help.testspace.com/integrations#cicd-systems">CI/CD tools</a> and provides an alternative to the <a href="https://www.cypress.io/">Cypress.io</a> dashboard for visualizing test results and other important metrics.</p>
<p><img src="../assets/images/blog/cypress-current-results.png" alt="Current Test Results" title="Current Test Results" /></p>
<h2 id="cypress-xml-reporter">Cypress XML Reporter</h2>
<p>Although there is a default <a href="https://github.com/michaelleeallen/mocha-junit-reporter">junit reporter</a> built into Cypress, it is very basic and does not capture important failure details like <code class="language-plaintext highlighter-rouge">screenshots</code> and <code class="language-plaintext highlighter-rouge">logs</code>. Thus, we decided to implement a new <a href="https://github.com/testspace-com/cypress-xml-reporter">Cypress XML Reporter</a> that supports this type of information, including <code class="language-plaintext highlighter-rouge">videos</code> of failing tests. These artifacts are automatically attached to <code class="language-plaintext highlighter-rouge">suites</code> and managed in Testspace.</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>npm install cypress-xml-reporter --save-dev
</code></pre></div></div>
<p>To publish content simply “push” file(s) with the Testspace CLI.</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>testspace results/**/*.xml{cypress/e2e}
</code></pre></div></div>
<h2 id="example-repo">Example Repo</h2>
<p>This <a href="https://github.com/testspace-com/example-cypress.xml.reporter">example repo</a> demonstrates how the <strong>Cypress Test Runner</strong>, the <a href="https://github.com/testspace-com/cypress-xml-reporter">Cypress XML Reporter</a>, and <a href="https://testspace.com">Testspace</a> can work together.</p>
<p>In this example, three use cases supported by Testspace are demonstrated:</p>
<ol>
<li>The <code class="language-plaintext highlighter-rouge">logs</code> generated by the <em>cypress terminal report</em></li>
<li>The <code class="language-plaintext highlighter-rouge">video</code> capturing the execution of the <em>todo.cy.js</em> file</li>
<li>The <code class="language-plaintext highlighter-rouge">screenshot</code> of the test case - <strong>displays two todo items by default</strong> - that failed (see below)</li>
</ol>
<p>The following test suite can be viewed <a href="https://testspace-com.testspace.com/projects/testspace-com:example-cypress.xml.reporter/spaces/main/current/1-getting-started">HERE</a>:</p>
<p><img src="../assets/images/blog/cypress-failing-test-suite.png" alt="Failing Test Suite" title="Failing Test Suite" /></p>
<h2 id="get-setup-in-minutes">Get setup in minutes!</h2>
<p>Try <a href="/pricing">Testspace</a> risk-free. No credit card is required.</p>
<p>Have questions? <a href="mailto:contact@testspace.com">Contact us.</a></p>
Mark Underseth

Capture screenshots on test failures using Jest &amp; Puppeteer (2023-02-09)
https://www.testspace.com/blog/screenshot-on-test-failure-using-jest-and-puppeteer
<p>For our Testspace UI testing, we use the <a href="https://jestjs.io/">Jest</a> test framework and <a href="https://pptr.dev/">Puppeteer</a> for controlling the browser. One challenge we experienced when executing “headless” automated tests using our CI system was having no visibility of the active screen on a test failure. It soon became difficult to triage failures when we deployed more extensive testing scenarios.</p>
<p>The following is an example test results record maintained on the Testspace server.</p>
<p><img src="../assets/images/blog/jest-puppeteer-failing-results.png" alt="Test Results published" title="Test Results published" /></p>
<p>Because Testspace supports <a href="https://help.testspace.com/publish/push-data-results#annotations">annotating</a> tests with arbitrary files, including images, we wanted to include a captured screenshot when a test fails. To accomplish this we extended the Jest Puppeteer environment (see below).</p>
<p>The following image provides an example of an attached screenshot with a failed test case:</p>
<p><img src="../assets/images/blog/jest-puppeteer-failing-suite.png" alt="Console, Image, Error Message, and Call Stack" title="Console, Image, Error Message, and Call Stack" /></p>
<p>When clicking on the <code class="language-plaintext highlighter-rouge">"should_match_Testspace_demo_project_.png"</code> link, the following image will be rendered.
The image is a “difference” image from a failing “visual test”, where the screenshots were expected to match.</p>
<p><img src="../assets/images/blog/jest-puppeteer-failing-suite-image.png" alt="Difference Image" title="Difference Image" /></p>
<blockquote>
<p><strong>NOTE</strong>. Refer to this example <a href="https://github.com/testspace-com/example-jest.puppeteer.screenshots">REPO</a> for more details. And the corresponding Testspace <a href="https://testspace-com.testspace.com/projects/67670/spaces">PROJECT</a>.</p>
</blockquote>
<p>To support capturing and publishing screenshots associated with test failures, we extended the Puppeteer and Jest environment.</p>
<h2 id="extending-environment">Extending Environment</h2>
<p>To capture images automatically on test failures, we extended the Jest Puppeteer Environment. Two use cases were covered:</p>
<ol>
<li>Capture a screenshot of a test failure</li>
<li>Capture an <em>image difference</em> when a visual test fails using the <a href="https://github.com/americanexpress/jest-image-snapshot">jest image snapshot</a> package</li>
</ol>
<p>The basic concept is to <em>extend</em> the <code class="language-plaintext highlighter-rouge">handleTestEvent</code>, specifically on a test failure. Note, refer to the <a href="https://github.com/argos-ci/jest-puppeteer#extend-puppeteerenvironment">Jest Puppeteer Environment</a> article for specifics.</p>
<ul>
<li>If an <em>image diff</em> has been generated, capture it for publishing</li>
<li>Otherwise, take a screenshot of the <em>current screen</em></li>
</ul>
<p>Generated images are moved into a dedicated folder; in this example it is called <code class="language-plaintext highlighter-rouge">screenshots</code>.
Check out the example repo here - <a href="https://github.com/testspace-com/example-jest.puppeteer.screenshots">example-jest.puppeteer.screenshots</a>.</p>
<p>Refer to <code class="language-plaintext highlighter-rouge">jest-custom-environment.js</code> for specifics. Also refer to the <code class="language-plaintext highlighter-rouge">jest-custom-global-setup.js</code> for global setup requirements.</p>
<h2 id="publishing-results">Publishing Results</h2>
<p>To support attaching screenshots/images, a Testspace <a href="https://help.testspace.com/publish/push-data-results#content-list">content list file</a> is used. When a test fails, an image is created (or used if auto-generated) and an entry is added to the content list file. The image name is based on the test case name.</p>
<p>An example content list file entry (<code class="language-plaintext highlighter-rouge">screenshots-list.txt</code>):</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>"[Suite Name]+./screenshots/test case name.jpeg{screenshot}"
</code></pre></div></div>
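Because the entry format is predictable, building it can be a one-liner. A hypothetical helper (the function name and parameters are ours) that produces entries matching the format shown above:

```javascript
// Hypothetical helper: build one Testspace content-list entry matching the
// "[Suite Name]+./screenshots/test case name.jpeg{screenshot}" format.
// Appending these lines to screenshots-list.txt attaches each image to its suite.
function contentListEntry(suiteName, testName, dir = './screenshots') {
  return `"[${suiteName}]+${dir}/${testName}.jpeg{screenshot}"`;
}
```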
<p>When publishing to Testspace the following command line is used:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>testspace junit.xml @./screenshots-list.txt
</code></pre></div></div>
<h2 id="get-setup-in-minutes">Get setup in minutes!</h2>
<p>Try <a href="/pricing">Testspace</a> risk-free. No credit card is required.</p>
<p>Have questions? <a href="mailto:contact@testspace.com">Contact us.</a></p>
Mark Underseth

Visualizing GitHub Automated Test Results (2022-06-27)
https://www.testspace.com/blog/visualizing-test-results-with-github-actions
<p>Testspace has built-in integrations with GitHub Actions. To publish content simply “push” file(s) with the Testspace CLI. Test results, code coverage, and other artifacts can be published using a single command.</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>testspace path/to/**/results*.xml coverage.xml ..
</code></pre></div></div>
<p>Testspace <strong>Automation Reporting</strong> includes support for</p>
<ul>
<li>branching</li>
<li>pull requests</li>
<li>forks</li>
<li>parallel jobs</li>
<li>and multiple workflows</li>
</ul>
<p>The GitHub Actions image below is from this <a href="https://github.com/testspace-com/hello.publish" target="_blank">Hello Publish</a> sample that uses pre-canned results generated from <em>multiple workflows</em> based on the same <em>commit id</em>.</p>
<p><img src="/assets/images/blog/github-actions.png" alt="GitHub Actions" title="GitHub Actions" /></p>
<p>The <a href="https://testspace-com.testspace.com/spaces/130525/current" target="_blank">test results</a>, generated from the separate workflows, are aggregated into a single record using <a href="https://help.testspace.com/publish/push-data-results#folders" target="_blank">folders</a> to organize the content.</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>testspace "[${{ github.workflow }}]./testcontent/*.xml"
</code></pre></div></div>
<p><img src="/assets/images/blog/github-actions-integration.png" alt="GitHub Actions Integration" title="Benefits of GitHub Actions Integration" /></p>
<h3 id="benefits">Benefits</h3>
<p>Teams can <strong>visualize all their test results</strong> with a single dashboard providing history, metrics, and other information.</p>
<p>The Testspace <strong>Dashboard</strong> provides:</p>
<ul>
<li>Built-in <a href="https://help.testspace.com/dashboard/space-metrics" target="_blank">metrics/graphs</a></li>
<li>Extensive <a href="https://help.testspace.com/dashboard/space-failures" target="_blank">Failure Tracking Management</a></li>
<li>Automatic <a href="https://help.testspace.com/dashboard/space-failures#failure-state" target="_blank">Flaky analysis</a></li>
<li><a href="https://help.testspace.com/dashboard/project-insights" target="_blank">Insights</a> for process improvements</li>
</ul>
<p><strong>Check out the following</strong>:</p>
<ul>
<li><a href="https://demo.testspace.com/" target="_blank">LIVE DASHBOARD</a></li>
<li><a href="https://help.testspace.com/publish/overview" target="_blank">Documentation Overview</a> for pushing different types of test content to Testspace</li>
</ul>
<h2 id="integration">Integration</h2>
<p>There are two steps required to include Testspace within a workflow yml file:</p>
<ul>
<li>Include the <strong>setup-testspace</strong> <em>action</em></li>
<li>Push xml file(s) containing results with the <strong>testspace</strong> <em>client</em></li>
</ul>
<p>For more details refer to the <a href="https://help.testspace.com/getting.started" target="_blank">getting started</a> documentation.</p>
<h3 id="setup-action">Setup Action</h3>
<p>Testspace provides a <a href="https://github.com/marketplace/actions/testspace-setup-cli" target="_blank">setup cli action</a>, simplifying the integration.</p>
<div class="language-yml highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="na">steps</span><span class="pi">:</span>
<span class="pi">-</span> <span class="na">uses</span><span class="pi">:</span> <span class="s">actions/checkout@v2</span> <span class="c1"># required "before" setup for checks to be included</span>
<span class="pi">-</span> <span class="na">name</span><span class="pi">:</span> <span class="s">Testspace client setup</span>
<span class="na">uses</span><span class="pi">:</span> <span class="s">testspace-com/setup-testspace@v1</span>
<span class="na">with</span><span class="pi">:</span>
<span class="na">domain</span><span class="pi">:</span> <span class="s">${{ github.repository_owner }}</span>
<span class="na">token</span><span class="pi">:</span> <span class="s">${{ secrets.TESTSPACE_TOKEN }}</span> <span class="c1"># optional, only required for private repos</span>
<span class="s">..</span>
</code></pre></div></div>
<h3 id="push-xml-files">Push XML File(s)</h3>
<p>Files – output from the Continuous Integration process – are pushed to the Testspace Server using a simple command line client.</p>
<p>For details of the different test content supported refer to the <a href="https://help.testspace.com/publish/push-data-results#file-content" target="_blank">file content</a> documentation.</p>
<div class="language-yml highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="na">steps</span><span class="pi">:</span>
<span class="pi">-</span> <span class="na">name</span><span class="pi">:</span> <span class="s">Testspace push test content</span>
<span class="na">run</span><span class="pi">:</span> <span class="s">testspace "*.xml"</span>
</code></pre></div></div>
<h3 id="example-using-matrix">Example Using Matrix</h3>
<p><a href="https://github.com/testspace-com/hello.publish" target="_blank">This sample</a> uses pre-canned (fake) results leveraging <code class="language-plaintext highlighter-rouge">multiple workflows</code>. The workflow below can be found <a href="https://github.com/testspace-com/hello.publish/blob/main/.github/workflows/matrix.yml" target="_blank">here</a>.</p>
<div class="language-yml highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="na">name</span><span class="pi">:</span> <span class="s">Matrix</span>
<span class="na">on</span><span class="pi">:</span>
<span class="na">push</span><span class="pi">:</span>
<span class="na">jobs</span><span class="pi">:</span>
<span class="na">test</span><span class="pi">:</span>
<span class="na">runs-on</span><span class="pi">:</span> <span class="s">${{ matrix.os }}</span>
<span class="na">strategy</span><span class="pi">:</span>
<span class="na">matrix</span><span class="pi">:</span>
<span class="na">os</span><span class="pi">:</span> <span class="pi">[</span><span class="nv">ubuntu-latest</span><span class="pi">,</span> <span class="nv">macos-latest</span><span class="pi">,</span> <span class="nv">windows-latest</span><span class="pi">]</span>
<span class="na">steps</span><span class="pi">:</span>
<span class="pi">-</span> <span class="na">uses</span><span class="pi">:</span> <span class="s">actions/checkout@v2</span>
<span class="pi">-</span> <span class="na">name</span><span class="pi">:</span> <span class="s">Testspace client setup</span>
<span class="na">uses</span><span class="pi">:</span> <span class="s">testspace-com/setup-testspace@v1</span>
<span class="na">with</span><span class="pi">:</span>
<span class="na">domain</span><span class="pi">:</span> <span class="s">${{github.repository_owner}}</span>
<span class="pi">-</span> <span class="na">name</span><span class="pi">:</span> <span class="s">Testspace push test content</span>
<span class="na">run</span><span class="pi">:</span> <span class="s">testspace "[${{ github.workflow }} / ${{ matrix.os}}]./testcontent/*.xml"</span>
</code></pre></div></div>
<h2 id="get-setup-in-minutes">Get setup in minutes!</h2>
<p>Try <a href="/pricing">Testspace</a> risk-free. No credit card is required.</p>
<p>Have questions? <a href="mailto:contact@testspace.com">Contact us.</a></p>
Mark Underseth

Implementing Manual Tests using GitHub Repositories (2022-01-15)
https://www.testspace.com/blog/github-test-case-management
<p>Testspace automatically discovers manual tests that are captured using “simple” plain text <a href="https://docs.github.com/en/get-started/writing-on-github/getting-started-with-writing-and-formatting-on-github/basic-writing-and-formatting-syntax" target="_blank">Markdown</a> files and are committed to a <strong>repository</strong>. This approach follows the same process as software development - version control, pull requests for reviews, etc. The files are rendered as test instructions, allowing human testers to execute and provide status.</p>
<p>Optionally, for advanced implementation, a <strong>liquid template language</strong> can be used. This enables easier implementation of more complex functionality - variables, subroutines, and conditional logic are available, facilitating reuse and data-driven test cases.</p>
<p>There is also built-in support for calling <strong>remote serverless functions</strong> used for provisioning the state of the software under test. Users can implement tests that reduce manual execution time by automating tedious setups.</p>
<p>Implementing and executing manual tests includes:</p>
<ol>
<li>Leveraging <strong>Repository Functionality</strong> for test development</li>
<li>Support for the <strong>Liquid Template Language</strong></li>
<li>Built-in <strong>Automated Fixturing</strong> functionality</li>
</ol>
<h2 id="repository-functionality">Repository Functionality</h2>
<p>Why is integrating with GitHub’s repositories so useful for manual testing? Traditional <em>Test Management</em> applications use <em>proprietary user interfaces</em> to capture the test case instructions. The test instructions thus become difficult to share and review with other team members, especially developers.</p>
<p>We have taken a different approach by using plain text files with support for the lightweight markup language called <a href="https://docs.github.com/en/get-started/writing-on-github/getting-started-with-writing-and-formatting-on-github/basic-writing-and-formatting-syntax" target="_blank">Markdown</a>. These human-readable text files are stored and maintained in your <strong>repository</strong>. This model follows the developer workflow, enabling manual testing to <em>supplement</em> existing automated testing and facilitate developers participating in the manual test implementation “review” process.</p>
<p>This model includes the source <code class="language-plaintext highlighter-rouge">versioning</code>, <code class="language-plaintext highlighter-rouge">branching</code>, and <code class="language-plaintext highlighter-rouge">pull request</code> functionality supported by GitHub.</p>
<ul>
<li>Change (commits) tracking</li>
<li>Pull Request for reviews</li>
<li>Developer participation using plain-text instructions</li>
</ul>
<p>Thus, both automated and manual test files are stored in a repository.</p>
<p><img src="/assets/images/blog/manual-test-as-code-repo.png" alt="Manual Tests Repo" title="Manual Tests Repo" /></p>
<p>The “Hello Manual” sample can be found <a href="https://github.com/testspace-com/hello.manual" target="_blank">here</a>.</p>
<p><strong>IMPORTANT TIP 💡</strong></p>
<blockquote>
<p>Use the <a href="https://docs.github.com/en/codespaces/the-githubdev-web-based-editor" target="_blank">GITHUB.DEV</a> web-based editor, which runs “entirely in your browser”, for making changes. No console, no installation, just a great editor! <strong>Just press</strong> <strong><code class="language-plaintext highlighter-rouge">.</code></strong> while browsing your repository.</p>
</blockquote>
<p><img src="/assets/images/blog/manual-test-as-code-repo-source.png" alt="Manual Tests Repo Source" title="Manual Tests Repo Source" /></p>
<p>Testspace automatically discovers test files in a repository and renders them for human testers to <em>execute</em> and provide <em>status</em>.</p>
<p><img src="/assets/images/blog/manual-test-as-code-repo-spec.png" alt="Manual Tests Repo Spec" title="Manual Tests Repo Spec" /></p>
<blockquote>
<p>Testspace <strong>automatically discovers</strong> tests that are committed to a repository, rendering the files as manual test instructions.</p>
</blockquote>
<h2 id="liquid-template-language">Liquid Template Language</h2>
<p>Testspace supports the template language called <a href="https://shopify.github.io/liquid/">Liquid</a>. Test files are handled as template files, meaning they get preprocessed before being rendered. This functionality enables a test file to use variables, include files (subroutines) along with passing parameters, and even the ability to implement conditional logic.</p>
<p>Liquid is an open-source template language created by Shopify. Jekyll and GitHub Pages also support the Liquid template language. It is a simple yet <em>comprehensive language</em> that enables very powerful capabilities.</p>
<p>The following is an example of a metadata block (front-matter section) located at the top of the file. In this example, there is a list of OS systems.</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>---
testspace:
title: OS Systems
matrix: # test different OS systems
- name: Windows
reqs: "[Windows details](https://staging7.newco.com/windows)"
- name: Linux
reqs: "[Linux details](https://staging7.newco.com/linus)"
---
# Example Test
blah blah ..
</code></pre></div></div>
<p>Within the test body a loop can be leveraged:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>{% for os in spec.matrix %}
## Test {{ os.name }}
* check on requirements: {{ os.reqs }}
{% endfor %}
</code></pre></div></div>
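For illustration, assuming <code class="language-plaintext highlighter-rouge">spec.matrix</code> resolves to the front-matter list above, the loop would render Markdown along these lines:

```
## Test Windows
* check on requirements: [Windows details](https://staging7.newco.com/windows)

## Test Linux
* check on requirements: [Linux details](https://staging7.newco.com/linus)
```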
<p>The following is a list of core functionality supported by Liquid:</p>
<ul>
<li>Variables</li>
<li>Conditional logic</li>
<li>Parameters</li>
<li>Include files</li>
</ul>
<p>The concept is to create powerful tests <em>“without repeating yourself”</em>.</p>
<p><strong>IMPORTANT TIP 💡</strong></p>
<blockquote>
<p>Refer to the <a href="https://help.testspace.com/manual/desktop-preview" target="_blank">Desktop Preview</a> documentation for reviewing tests on your desktop leveraging Jekyll.</p>
</blockquote>
<h2 id="automated-fixturing">Automated Fixturing</h2>
<p>An <a href="https://help.testspace.com/manual/implementation-spec#automated-fixtures" target="_blank">automated test fixture</a> is a serverless function hosted by a GitHub Workflow or an AWS Lambda. Automated fixtures can run before and/or after the manual test instructions to set up and/or tear down a specific test environment.</p>
<p>Testspace supports integrating automation with manual testing. The following are some key benefits:</p>
<ul>
<li>Reduce manual execution time. Fixturing can leverage automation for tedious and redundant setup/teardown requirements versus human execution.</li>
<li>Deploy “hybrid testing” leveraging automation and human observations.</li>
<li>Minimize IT setup for Human testers. All testing, including automation, is executed using a web browser.</li>
</ul>
<blockquote>
<p>Testspace fixtures enable testing that leverages a hybrid of automation and manual verification.</p>
</blockquote>
<h3 id="example">Example</h3>
<p>The header block defines the fixture:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>---
testspace:
title: Setup database
before:
name: github::fixture
payload:
a: one
b: two
---
# Check DB
This test verifies state changes based on database content.
## Check One
..
## Check Two
..
</code></pre></div></div>
<p>The GitHub Action that gets invoked (<code class="language-plaintext highlighter-rouge">.github/workflows/testspace.yml</code>):</p>
<div class="language-yml highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="na">name</span><span class="pi">:</span> <span class="s">Testspace</span>
<span class="na">on</span><span class="pi">:</span>
<span class="na">workflow_dispatch</span><span class="pi">:</span>
<span class="na">inputs</span><span class="pi">:</span>
<span class="na">name</span><span class="pi">:</span>
<span class="na">description</span><span class="pi">:</span> <span class="s1">'</span><span class="s">Function</span><span class="nv"> </span><span class="s">name'</span>
<span class="na">required</span><span class="pi">:</span> <span class="no">true</span>
<span class="na">payload</span><span class="pi">:</span>
<span class="na">description</span><span class="pi">:</span> <span class="s1">'</span><span class="s">Function</span><span class="nv"> </span><span class="s">input-payload'</span>
<span class="na">required</span><span class="pi">:</span> <span class="no">true</span>
<span class="na">context</span><span class="pi">:</span>
<span class="na">description</span><span class="pi">:</span> <span class="s1">'</span><span class="s">Function</span><span class="nv"> </span><span class="s">execution-context'</span>
<span class="na">required</span><span class="pi">:</span> <span class="no">true</span>
<span class="na">env</span><span class="pi">:</span>
<span class="na">IN_NAME</span><span class="pi">:</span> <span class="s">${{ inputs.name }}</span>
<span class="na">IN_PAYLOAD</span><span class="pi">:</span> <span class="s">${{ inputs.payload }}</span>
<span class="na">jobs</span><span class="pi">:</span>
<span class="na">fixture</span><span class="pi">:</span>
<span class="na">if</span><span class="pi">:</span> <span class="s">github.event.inputs.name == 'fixture'</span>
<span class="na">runs-on</span><span class="pi">:</span> <span class="s">ubuntu-latest</span>
<span class="na">steps</span><span class="pi">:</span>
<span class="pi">-</span> <span class="na">uses</span><span class="pi">:</span> <span class="s">actions/checkout@v1</span>
<span class="pi">-</span> <span class="na">uses</span><span class="pi">:</span> <span class="s">actions/setup-node@v1</span>
<span class="na">with</span><span class="pi">:</span>
<span class="na">node-version</span><span class="pi">:</span> <span class="s1">'</span><span class="s">12'</span>
<span class="pi">-</span> <span class="na">name</span><span class="pi">:</span> <span class="s">Call Script implementing setup fixture</span>
<span class="na">run</span><span class="pi">:</span> <span class="s">node ./.github/workflows/handler.js</span>
</code></pre></div></div>
<p><strong>NOTE</strong></p>
<blockquote>
<p>The <code class="language-plaintext highlighter-rouge">handler.js</code> script (aka function) is invoked along with the passed parameters from the test file. This script does all of the specific fixturing.</p>
</blockquote>
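A rough sketch of what <code class="language-plaintext highlighter-rouge">handler.js</code> might look like. Assumptions: the workflow above exposes the fixture inputs via the <code class="language-plaintext highlighter-rouge">IN_NAME</code> and <code class="language-plaintext highlighter-rouge">IN_PAYLOAD</code> environment variables, and the payload map from the front matter arrives serialized as JSON; the defaults and log message are ours:

```javascript
// Hypothetical handler.js sketch: read the fixture name and payload passed
// from the test file's front matter via the workflow's env vars.
// (Defaults below are illustrative, mirroring the front-matter example.)
const name = process.env.IN_NAME || 'fixture';
const payload = JSON.parse(process.env.IN_PAYLOAD || '{"a":"one","b":"two"}');

console.log(`Running ${name} with a=${payload.a}, b=${payload.b}`);
// ...provision the database (or other environment state) here...
```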
<p><strong>Check out the following</strong>:</p>
<ul>
<li><a href="https://help.testspace.com/manual/overview" target="_blank">Doc Overview</a> for implementing and executing tests using Testspace</li>
<li><a href="https://github.com/testspace-com/testspace.getting-started" target="_blank">Getting Started</a> sample, which includes an <a href="https://help.testspace.com/manual/implementation-spec#automated-fixtures" target="_blank">automated fixture</a></li>
<li>A <a href="https://help.testspace.com/tutorial/setup">Walkthrough Tutorial</a></li>
</ul>
<h2 id="get-setup-in-minutes">Get setup in minutes!</h2>
<p>Try <a href="/pricing">Testspace</a> risk-free. No credit card is required.</p>
<p>Have questions? <a href="mailto:contact@testspace.com">Contact us.</a></p>
Mark Underseth

Managing Software Quality Under Continuous Integration (2021-09-18)
https://www.testspace.com/blog/managing-software-quality-under-continuous-integration
<p>Learn how Testspace simplifies managing software quality and release readiness by leveraging important information buried in your CI/Build systems.</p>
<p><strong>A common problem our customers face</strong> is managing software quality across multiple teams when they’re
using different CI tools and build systems. All of the valuable information is buried within the various systems.</p>
<blockquote>
<p>Going to each CI system to extract software status from “build-centric content” is difficult and time-consuming.</p>
</blockquote>
<p>Testspace is a Test Management application designed for automation that connects to your testing workflow. Testspace aggregates data – test results, code coverage, static analysis, source code changes,
etc. – from all your CI systems, and performs analysis on the content including historical patterns.</p>
<p>Testspace transforms “<em>build-centric data</em>” into “<em>actionable software status</em>” enabling software companies to:</p>
<ul>
<li>Assess software readiness using Metrics and Analytics.</li>
<li>Monitor the effectiveness of the testing process with Workflow Indicators.</li>
<li>Optimize managing change side-effects/regressions with Failure Tracking.</li>
</ul>
<h1 id="assess-software-readiness-using-metrics-and-analytics">Assess Software Readiness using Metrics and Analytics</h1>
<p>Testspace aggregates and analyzes all CI data – build logs, test results, log files, static analysis,
code coverage, code changes, and more – from every build on every CI system. By pushing data, using
a simple client (Windows, Linux, and macOS supported), the process of aggregation is a simple addition to
any CI automation.</p>
<pre>
$ testspace test-results*.xml static-analysis.xml code-coverage.xml build.log ...
</pre>
<p>With Testspace you can monitor the health of every branch under every repository, all from a single view.</p>
<blockquote>
<p>Important metrics of any type can be included in the build status check.</p>
</blockquote>
<p><img src="/assets/images/blog/managing-continuous-integration-project.png" alt="Continuous Integration Testing Project" title="Review important Metrics" /></p>
<p>Add your own custom metrics with charts such as:</p>
<ul>
<li>Resource consumption - of memory, storage, peripherals, etc.</li>
<li>Timing information - such as latency, delays, and duration.</li>
<li>Measurements and counts - like attempts, retries, and completions.</li>
</ul>
<p><img src="/assets/images/blog/managing-continuous-integration-custom-metrics.png" alt="Continuous Integration Custom Metrics" title="Generate Custom Charts" /></p>
<p>Using thresholds and criteria, Managers define what it means for a software build to be deemed healthy.</p>
<readmore class="readmore">
<a class="btn-arrow" href="/blog/turning-log-file-data-into-actionable-metrics"> Read more </a>
</readmore>
<p>Interested in leveraging online Code Coverage providers such as <a href="https://codecov.io" target="_blank">Codecov.io</a> and <a href="https://coveralls.io" target="_blank">Coveralls.io</a>?</p>
<readmore class="readmore">
<a class="btn-arrow" href="/blog/integration-with-3rd-party-code-coverage"> Read more </a>
</readmore>
<h1 id="monitor-the-effectiveness-of-the-testing-process-with-workflow-indicators">Monitor the Effectiveness of the Testing Process with Workflow Indicators</h1>
<p>Testspace Insights - workflow indicators that continuously assess your testing process - are derived from computational analysis of your regression patterns, code coverage, source change rates, and responsiveness to fixing issues.</p>
<p><img src="/assets/images/blog/managing-continuous-integration-insights.png" alt="Continuous Integration Test Analytics" title="Review Insights to improve process" /></p>
<p>Insights help answer important questions:</p>
<ul>
<li>Is the current testing providing value to the team?</li>
<li>Are regressions being addressed in a timely fashion?</li>
<li>Are changes currently in process improving the quality?</li>
</ul>
<readmore class="readmore">
<a class="btn-arrow" href="/blog/test-analytics-for-continuous-integration"> Read more </a>
</readmore>
<h1 id="optimize-managing-change-side-effectsregressions-with-failure-tracking">Optimize Managing Change Side-Effects/Regressions with Failure Tracking</h1>
<p>Developers can move beyond the flat, endless console to find and triage test failures.
In Testspace, your test results mirror your test folders and subfolders as defined in your
repository. When viewing results, a single click reduces a hierarchy of folders and suites
into a single view showing only the suites that contain test failures.</p>
<p>Suite views filter on failures by default and provide complete failure context: call stack,
timing information, and more. <code class="language-plaintext highlighter-rouge">Click the image</code> to see how.</p>
<gif>
<a href="/assets/images/blog/managing-test-regressions.gif">
<img align="left" border="0" alt="Tracking Failures" title="Click to see how" src="/assets/images/blog/managing-test-failures.png" />
</a>
</gif>
<p>Failing cases, blocked from being resolved by other dependencies, can be “Exempted” so only
new failures are reported.</p>
<readmore class="readmore">
<a class="btn-arrow" href="/blog/filtering-and-tracking-test-failures"> Read more </a>
</readmore>
<p><br /><br /><br /></p>
<h1 id="connect-testspace-to-your-online-version-control-service">Connect Testspace to Your Online Version Control Service</h1>
<p><strong>Using GitHub, GitLab, or Bitbucket?</strong></p>
<p>Testspace is integrated with all three. And with Testspace Connected Services, there are no plugins or extensions to install. Testspace account admins simply
connect to their Organizations (GitHub), Teams (Bitbucket), or Groups (GitLab). Once connected,
Testspace Projects and Spaces map to Repositories and their Branches automatically.</p>
<gh-readmore class="readmore with-img">
<div>
<img alt="Github logo" src="/assets/images/website/tooling/github.png" />
<h4>GitHub</h4>
<a class="btn-arrow" href="/blog/testspace-integration-with-github">
Read more
</a>
</div>
</gh-readmore>
<gl-readmore class="readmore with-img">
<div>
<img alt="Gitlab logo" src="/assets/images/website/tooling/gitlab.png" />
<h4>GitLab</h4>
<a class="btn-arrow" href="/blog/testspace-integration-with-gitlab">
Read more
</a>
</div>
</gl-readmore>
<bb-readmore class="readmore with-img">
<div>
<img alt="Bitbucket logo" src="/assets/images/website/tooling/bitbucket.png" />
<h4>Bitbucket</h4>
<a class="btn-arrow" href="/blog/testspace-integration-with-bitbucket">
Read more
</a>
</div>
</bb-readmore>
<h2 id="get-setup-in-minutes">Get setup in minutes!</h2>
<p>Try <a href="/pricing">Testspace</a> risk-free. No credit card is required.</p>
<p>Have questions? <a href="mailto:contact@testspace.com">Contact us.</a></p>
Testspace Team
Learn how Testspace simplifies managing software quality and release readiness, by leveraging important information buried in your CI/Build systems.
Turning Log File Data Into Actionable Metrics
2021-03-06T00:00:00+00:00
https://www.testspace.com/blog/turning-log-file-data-into-actionable-metrics
<p>A common practice for many development organizations, including ours, is to collect logs and data during automated testing. Whether this is done to provide context for failures or as secondary quality measurements, ancillary data can be a vital part of any <a href="https://en.wikipedia.org/wiki/Continuous_integration" target="_blank">Continuous Integration (CI)</a> process. The challenge is collecting and using the data efficiently and effectively, without the manual triage and dissemination of data after each run.</p>
<p><img src="/assets/images/blog/log-file-inspection.jpg" alt="Graphing log data produced during continuous integration" title="Use log data as part of your testing!" /></p>
<p>Testspace <a href="https://help.testspace.com/publish/push-data-results#custom-metrics" target="_blank">Custom Metrics</a> provides a simple approach to do just that; to collect and mine data into actionable metrics.</p>
<p>Based on your data, Custom Metrics – in many ways similar to standard metrics such as <em>test cases</em>, <em>code coverage</em>, and <em>static analysis</em> – are whatever’s important to you and your organization.</p>
<p>A few examples include:</p>
<ul>
<li><em>Resource consumption</em> - of memory, storage, peripherals, etc.</li>
<li><em>Timing information</em> - such as latency, delays, and duration.</li>
<li><em>Measurements and counts</em> - like attempts, retries, and completions.</li>
</ul>
<p>Metrics can be informational or used as criteria for health; a binary <code class="language-plaintext highlighter-rouge">pass|fail</code> indicator for the current set of published results.</p>
<p>The following cases illustrate simple examples of custom metrics in use.</p>
<h1 id="resource-utilization-during-continuous-integration">Resource Utilization during Continuous Integration</h1>
<p>Consider the following text file with usage values (high watermarks) produced during testing.</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>Resource Management Data
CPU - Max Utilization (%)
CPU 0 . . . . . . : 87
CPU 1 . . . . . . : 22
CPU 2 . . . . . . : 17
CPU 3 . . . . . . : 29
Memory - Max Utilization (%)
Physical . . . . : 55
Cache . . . . . . : 28
</code></pre></div></div>
<p>Let’s assume that it’s important for each CPU’s utilization to stay within a specific range. A value outside this range is an indication of incorrect (and perhaps harmful) software behavior, even if the tests are passing.</p>
<p>There are just two steps required: (1) add the values to a .csv file and (2) use the <code class="language-plaintext highlighter-rouge">Testspace client</code> to push the files to the <code class="language-plaintext highlighter-rouge">Testspace Server</code> along with the test results.</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>testspace test_results.xml resource_data.txt{resource_data.csv}
</code></pre></div></div>
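<p>Step (1) above can be sketched in Python. This is an illustration only – the parsing logic, the one-row CSV layout, and the healthy range of 20–90% utilization are invented for the example, not part of the Testspace client or the article's data.</p>

```python
# Sketch: turn the resource log shown above into a .csv for pushing to
# Testspace. Layout and threshold values here are illustrative assumptions.
import csv
import re

LOG_TEXT = """\
Resource Management Data
CPU - Max Utilization (%)
  CPU 0 . . . . . . : 87
  CPU 1 . . . . . . : 22
  CPU 2 . . . . . . : 17
  CPU 3 . . . . . . : 29
Memory - Max Utilization (%)
  Physical  . . . . : 55
  Cache . . . . . . : 28
"""

def parse_utilization(text):
    """Return {label: value} for every '<label> . . : <number>' line."""
    values = {}
    for line in text.splitlines():
        match = re.match(r"\s*(.+?)[\s.]*:\s*(\d+)\s*$", line)
        if match:
            values[match.group(1).strip()] = int(match.group(2))
    return values

values = parse_utilization(LOG_TEXT)

# Write one header row and one row of high-watermark values.
with open("resource_data.csv", "w", newline="") as fh:
    writer = csv.writer(fh)
    writer.writerow(values.keys())
    writer.writerow(values.values())

# Hypothetical healthy range for CPU utilization (invented example values).
LOW, HIGH = 20, 90
out_of_range = [name for name, v in values.items()
                if name.startswith("CPU") and not LOW <= v <= HIGH]
print(out_of_range)  # CPU 2 (17) is below the example range
```

With the sample log, the range check flags <code class="language-plaintext highlighter-rouge">CPU 2</code> – the same CPU whose value of 17 fell below the threshold in the chart below.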
<p>Once you push the first set of files, <a href="https://help.testspace.com/dashboard/space-metrics#custom" target="_blank">metric charts</a> are automatically added to track and monitor the data over time. Expressions with variables, operators, and functions can be used to process the data. Thresholds based on your criteria can be used to determine if the metric is behaving as expected.</p>
<p>The charts below show an example where the utilization of one CPU fell below the threshold (see 17). The failed criteria would have triggered a notification at the time.</p>
<p><img src="/assets/images/blog/log-file-resource-metrics.png" alt="graphing log data from continuous integration" title="Testspace custom charts generated from log data" /></p>
<h1 id="peak-memory-consumption-during-automated-testing">Peak Memory Consumption During Automated Testing</h1>
<p>Another example is charting peak memory consumption from multiple pools of different-size blocks.</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>pool . . . . size/block . . max used
------------------------------------
small . . . 128 bytes . . . 17
medium . . . 512 bytes . . . 12
large . . . 1024 bytes . . . 7
x-large . . 2048 bytes . . . 2
</code></pre></div></div>
<p>The <em>Stacked Columns</em> graph is well suited for combining and comparing aggregate data. By defining a simple formula for the metric, each entry in the column is calculated as the block size multiplied by the maximum number used. The total consumption, as displayed on the metric badge <code class="language-plaintext highlighter-rouge">Peak Memory</code>, is defined as the sum of all pools.</p>
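<p>The formula just described can be worked through directly. The pool values come from the sample table above; the code is an illustration of the arithmetic, not Testspace's internal calculation.</p>

```python
# Each stack entry = block size * max blocks used; the "Peak Memory" badge
# value = the sum across all pools. Data taken from the sample table above.
POOLS = {
    "small":   (128, 17),   # (block size in bytes, max blocks used)
    "medium":  (512, 12),
    "large":   (1024, 7),
    "x-large": (2048, 2),
}

per_pool = {name: size * used for name, (size, used) in POOLS.items()}
peak_memory = sum(per_pool.values())

for name, consumed in per_pool.items():
    print(f"{name:8} {consumed:6} bytes")
print(f"Peak Memory: {peak_memory} bytes")  # Peak Memory: 19584 bytes
```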
<p>The chart below provides a clear view of how memory consumption has grown over time.</p>
<p><img src="/assets/images/blog/log-file-custom_metric_columns_chart.png" alt="stacked column chart for graphing log data" title="Testspace supports stacked column charts for graphing log data" /></p>
<p>Informational metrics like the above can help to manage and optimize valuable software resources over time.</p>
<h1 id="performance-metrics-from-automated-testing">Performance Metrics from Automated Testing</h1>
<p>The third example looks at timing data, as shown in the file below. Let’s assume that the combined time must remain within some threshold. Anything beyond the threshold could lead to intermittent problems, or perhaps even the loss of competitive advantage.</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>Timing Measurements in "clock ticks."
Delay . . . . 1457 (time to schedule task)
Latency . . . 2391 (time from scheduling to execution)
Duration . . 9490 (time during execution)
</code></pre></div></div>
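<p>The threshold check described above amounts to a single comparison. The timing values come from the sample file; the threshold of 12,000 clock ticks is an invented example value (the article does not state one), chosen so the result matches the failed metric shown below.</p>

```python
# Combined time (delay + latency + duration, in clock ticks) must stay under
# a limit. THRESHOLD_TICKS is a hypothetical value, not from the article.
TIMINGS = {"Delay": 1457, "Latency": 2391, "Duration": 9490}
THRESHOLD_TICKS = 12000  # hypothetical pass/fail criterion

total_ticks = sum(TIMINGS.values())
status = "pass" if total_ticks <= THRESHOLD_TICKS else "fail"
print(f"total={total_ticks} ticks, threshold={THRESHOLD_TICKS}: {status}")
# total=13338 ticks, threshold=12000: fail
```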
<p>From the badge, we can see that the metric has failed and that the total time displayed has exceeded the threshold. From the chart, it’s easy to see that the cause was an increase in latency.</p>
<p><img src="/assets/images/blog/log-file-custom_metric_columns_chart_w_threshold.png" alt="graphing log data produced from continuous integration" title="Testspace supports thresholds used for alerts!" /></p>
<h2 id="get-setup-in-minutes">Get setup in minutes!</h2>
<p>Try <a href="/pricing">Testspace</a> risk-free. No credit card is required.</p>
<p>Have questions? <a href="mailto:contact@testspace.com">Contact us.</a></p>
Testspace Team
Test Analytics for Continuous Integration
2021-01-05T00:00:00+00:00
https://www.testspace.com/blog/test-analytics-for-continuous-integration
<p>We are introducing our second version of Test Analytics - <code class="language-plaintext highlighter-rouge">Testspace Insights</code>. This article provides an overview of the feature set and a brief description of other important metrics being considered for future releases.</p>
<h1 id="what-is-the-purpose-of-test-analytics">What is the Purpose of Test Analytics?</h1>
<p><strong>Used to optimize the change-build-test workflow.</strong> When complex software is being built, with large and distributed teams, <strong>time</strong> is
one of the most challenging items to manage. The efficiency of this <em>workflow</em> directly impacts the overall time required to
release software – <em>aka <a href="https://en.wikipedia.org/wiki/Release_management" target="_blank">Release Management</a></em>. Release Management
is the process of managing, planning, scheduling, and controlling a software build through different stages, including testing and deployment.</p>
<blockquote>
<p>Insights facilitate the improvement of the change-build-test workflow.</p>
</blockquote>
<p>So, what are some of the <strong>time-sinks</strong> of the <code class="language-plaintext highlighter-rouge">change-build-test workflow</code>?</p>
<ul>
<li><strong>Insufficient test coverage</strong>: The automated tests are poorly designed and don’t identify enough side-effects from source code changes. Thus, defects escape into the next phase.</li>
<li><strong>Poor test failure management</strong>: Ignoring test failures while the software continues to change. Tracking and responding to test failures using console output, email, and other archaic techniques. Requiring a heavy and time-consuming bug tracking system to manage resolving failures.</li>
<li><strong>Unstable results</strong>: Weak test automation, random test failures, and flaky tests all combined can generate noise, resulting in time-wasting activities.</li>
<li><strong>Lack of transparency</strong>: What is the current status? Are we progressing? Are members communicating? Meetings, emails, and handcrafted reports are useful, but visibility solely based on these traditional methods, versus mined data, can inhibit quick and timely informed decision making.</li>
</ul>
<p>Using <strong>Insights</strong> generated from the <code class="language-plaintext highlighter-rouge">change-build-test workflow</code> can significantly improve team member engagement and better enable driving operational effectiveness for releasing software.</p>
<h1 id="connects-to-existing-automation">Connects to Existing Automation</h1>
<p>Testspace connects to your <code class="language-plaintext highlighter-rouge">change-build-test workflow</code> using a simple command line utility. The utility is used to push content from your test automation system to the Testspace server. Content such as build status, test results, code coverage, source code changes, etc., are automatically collected, stored, analyzed, and then <strong>published with the current status</strong>. Computational analysis is continuously performed on the historical data using regression patterns, test effectiveness calculations, trends in coverage, and rates for resolving failures; all used <em>together</em> to generate <strong>Insights</strong>.</p>
<h1 id="how-to-use-the-indicators">How to use the Indicators</h1>
<p>The <code class="language-plaintext highlighter-rouge">Indicators</code> are calculated based on historical data collected for the selected period. The 3 Indicators focus on different areas, all related to providing more visibility into the <strong>efficiency</strong> of the <code class="language-plaintext highlighter-rouge">change-build-test workflow</code>. These indicators all work in concert, using mined data for continuous assessments.</p>
<p><img src="/assets/images/blog/test-analytics.jpg" alt="Test Analytics" title="Analytics- Insights to improve process!" /></p>
<p>The core tenets of an efficient workflow are:</p>
<ul>
<li>stable and consistent test results</li>
<li>automated tests that are capturing side-effects (i.e. failing tests) from source code changes</li>
<li>resolution of test failures at a reasonable rate</li>
</ul>
<blockquote>
<p>Efficient workflows generate healthy regressions, capture side-effects, and resolve these failures at a rapid rate.</p>
</blockquote>
<p>In addition to the Indicators, other important metrics are also provided showing <code class="language-plaintext highlighter-rouge">system instability</code>, <code class="language-plaintext highlighter-rouge">high-frequency failures</code> (i.e. team ignoring the failures), and <code class="language-plaintext highlighter-rouge">trends in coverage</code>.</p>
<h1 id="results-strength">Results Strength</h1>
<p>The <code class="language-plaintext highlighter-rouge">Results Strength</code> indicator is used to provide a <em>macro view</em> of the consistency of results being generated by your test automation. The very first step in optimizing the <em>workflow</em> is stabilizing automated test results. Without a <em>reasonable</em> level of consistency from testing, it is nearly impossible to make process improvements.</p>
<blockquote>
<p>Improvement requires consistent test results.</p>
</blockquote>
<p><img src="/assets/images/blog/test-analytics-results-strength.png" alt="Test Analytics Results Strength" title="Results Strength" /></p>
<p>A combination of the overall <em>Passing average</em> and the <em>Healthy Results average</em> are used. <em>Healthy Results</em> are the percentage of passing tests determined to be “<em>reasonable</em>” based on the current stage of development. The default is <code class="language-plaintext highlighter-rouge">100%</code>, but this can be defined by the team. By tracking both the average <em>Pass rate</em> and <em>Health rate</em>, the <code class="language-plaintext highlighter-rouge">Results Strength</code> indicator provides insight into the collective strength of the software under test for the selected time period.</p>
<p>Also factored in are results tagged as <code class="language-plaintext highlighter-rouge">Invalid</code>. This is based on significant variations from previous results such as drops in case count.</p>
<p>For more information regarding <code class="language-plaintext highlighter-rouge">Results Strength</code> refer <a href="https://help.testspace.com/dashboard/project-insights#results-strength" target="_blank">here</a>.</p>
<h1 id="test-effectiveness">Test Effectiveness</h1>
<p>The <code class="language-plaintext highlighter-rouge">Test Effectiveness</code> indicator is used for assessing how good the automated tests are at capturing side-effects, based on source code changes – i.e. generating <strong>new</strong> failures. These new failures have had a minimum of 5 non-failing test statuses. In general, new failures should be the result of source code changes – otherwise, instability of the application/infrastructure causes additional quality risk (i.e. random factors). If the source code is changing at a rapid rate, with no regressions, there may be a problem with the <em>usefulness</em> of the existing tests.</p>
<blockquote>
<p>Reasonable test regressions are considered healthy and beneficial to the workflow.</p>
</blockquote>
<p><img src="/assets/images/blog/test-analytics-test-effectiveness.png" alt="Test Analytics Test Effectiveness" title="Test Effectiveness" /></p>
<p>For more information regarding <code class="language-plaintext highlighter-rouge">Test Effectiveness</code> refer <a href="https://help.testspace.com/dashboard/project-insights#test-effectiveness" target="_blank">here</a>.</p>
<h1 id="workflow-efficiency">Workflow Efficiency</h1>
<p>The <code class="language-plaintext highlighter-rouge">Workflow Efficiency</code> indicator is used to provide a macro view of the efficiency of resolving test case failures. Lots of
regressions can indicate that the tests are effective, but even with good test coverage you still want rapid recovery. Referring to
Martin Fowler in his popular <a href="https://martinfowler.com/articles/continuousIntegration.html" target="_blank">Continuous Integration</a> article, failures should be addressed immediately. New test failures that are not addressed in a reasonable timeframe are considered “drifts” and add to the quality debt! By tracking the overall <em>Failures Resolved %</em> and the <em>Average Resolution Time</em>, teams can assess if their failure resolution process is acceptable.</p>
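<p>The two measures named above can be illustrated with a small calculation. Testspace derives these internally from mined data; the failure records below, and the idea of tracking them as (opened, resolved) timestamp pairs, are invented for the sketch.</p>

```python
# Illustrative sketch: Failures Resolved % and Average Resolution Time,
# computed from a hypothetical log of new test failures.
from datetime import datetime

# (opened, resolved-or-None) timestamps for each new failure
failures = [
    (datetime(2021, 1, 4, 9), datetime(2021, 1, 4, 15)),   # fixed in 6 hours
    (datetime(2021, 1, 5, 10), datetime(2021, 1, 7, 10)),  # fixed in 2 days
    (datetime(2021, 1, 6, 8), None),                       # still drifting
]

resolved = [end - start for start, end in failures if end is not None]
resolved_pct = 100.0 * len(resolved) / len(failures)
avg_hours = sum(d.total_seconds() for d in resolved) / len(resolved) / 3600

print(f"Failures Resolved: {resolved_pct:.0f}%")      # Failures Resolved: 67%
print(f"Average Resolution Time: {avg_hours:.1f} h")  # 27.0 h
```

The unresolved third failure is the “drift” the article warns about: it drags down the resolved percentage while source changes continue.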
<blockquote>
<p>Letting test failures drift, while source file changes continue, adds considerable effort to triage activity.</p>
</blockquote>
<p><img src="/assets/images/blog/test-analytics-workflow-efficiency.png" alt="Test Analytics Workflow Efficiency" title="Workflow Efficiency" /></p>
<p>For more information regarding <code class="language-plaintext highlighter-rouge">Workflow Efficiency</code> refer <a href="https://help.testspace.com/dashboard/project-insights#workflow-efficiency" target="_blank">here</a>.</p>
<h1 id="our-data">Our Data</h1>
<p>We measure and track <em>our</em> Testspace development, and you can see it <a href="https://s2.testspace.com/projects/s2technologies:testspace/insights" target="_blank">HERE</a>.<br /><br />
The Testspace server is developed using <a href="http://rubyonrails.org/" target="_blank">Ruby on Rails</a>, the source code is hosted using <a href="https://github.com/" target="_blank">GitHub</a>, and we use <a href="https://circleci.com" target="_blank">Circle CI</a> for test automation.</p>
<p><img src="/assets/images/blog/test-analytics-graph.png" alt="Test Analytics for Continuous Integration" title="Test Analytics for CI" /></p>
<p>For more information regarding Insights refer to our help documentation <a href="https://help.testspace.com/dashboard/project-insights">here</a>.</p>
<h1 id="our-model-is-constantly-improving">Our Model is constantly improving</h1>
<p>We are constantly monitoring and improving our <code class="language-plaintext highlighter-rouge">algorithms</code> and <code class="language-plaintext highlighter-rouge">statistical models</code> to extract more useful and <strong>actionable</strong> data that customers can use to improve their software development process. We track regression patterns, resolution rates for failures, risk related to change, and trends in code coverage. To continue improving Testspace Insights, we will continue to collect more data such as:</p>
<ul>
<li>Code Review cycle - GitHub Pull Request, GitLab Merge requests, Gerrit Code, etc.</li>
<li>Defect Tracking tool - Jira, GitHub Issues, etc.</li>
<li>Failure triage - tracking user review of test failures.</li>
</ul>
<p>As a company, our focus is to better enable software development teams to <strong>optimize the change-build-test workflow</strong> by leveraging all of the data they generate during development. Building a model that adapts, learns, and prescribes in an automated fashion is the focus of Testspace.</p>
<blockquote>
<p>Don’t let <strong>your data</strong> get dropped on the floor</p>
</blockquote>
<h2 id="get-setup-in-minutes">Get setup in minutes!</h2>
<p>Try <a href="/pricing">Testspace</a> risk-free. No credit card is required.</p>
<p>Have questions? <a href="mailto:contact@testspace.com">Contact us.</a></p>
Mark Underseth
Integration with 3rd Party Code Coverage
2020-03-04T00:00:00+00:00
https://www.testspace.com/blog/integration-with-3rd-party-code-coverage
<p>Testspace supports integrations with two of the leading online Code Coverage providers: <a href="https://about.codecov.io" target="_blank">Codecov.io</a> and <a href="https://coveralls.io" target="_blank">Coveralls.io</a>. Code coverage metrics are essential to effective Test Management and should be an integral part of any Continuous Integration (CI) system.</p>
<p>Although not a measurement of correctness, coverage metrics are essential for understanding and addressing coverage gaps or for maintaining a minimum level of coverage as the code base continues to grow. Testspace makes the inclusion of coverage metrics simple.</p>
<blockquote>
<p>Testspace adds more Quality Metrics to Test Management</p>
</blockquote>
<h1 id="adding-testspace-to-ci-automation-takes-minutes">Adding Testspace to CI Automation Takes Minutes</h1>
<p>Files – output from the Continuous Integration Process – are pushed to the Testspace server
using a simple client. <a href="https://help.testspace.com/publish/push-data-add-to-ci" target="_blank">Adding Testspace to online CI</a> requires adding the client utility to
the install process, and then using the client to push the files and metrics after
all tests have run, including build logs, static analysis, test results, and all
types of standard and custom metrics.</p>
<pre>
$ testspace.exe analysis.xml test_results.xml
</pre>
<h1 id="adding-3rd-party-code-coverage-to-testspace-takes-seconds">Adding 3rd Party Code Coverage to Testspace Takes Seconds</h1>
<p>To include your Codecov or Coveralls results, simply add the appropriate 3rd party link option to the Testspace command.</p>
<pre>
$ testspace.exe analysis.xml test_results.xml --link=coveralls
</pre>
<p>Upon running a new build and pushing the first set of results to Testspace, coverage badges will be added to the
appropriate branch/space on your Testspace Project listing - automatically.</p>
<p><img src="/assets/images/blog/integration-with-3rd-party-code-coverage-badges.png" alt="Integration with 3rd Party Code Code Badges" title="Testspace works with 3rd Party Badges" /></p>
<p>Thresholds, defined by your 3rd party coverage settings, will be used in the overall determination of
software branch health.</p>
<readmore-help>
<a class="btn-arrow" href="https://help.testspace.com/dashboard/space-results#health" target="_blank">
Read more
</a>
</readmore-help>
<p><img src="/assets/images/blog/integration-with-3rd-party-code-coverage-graph.png" alt="Coveralls and Codecov Coverage Chart" title="Showing working with Coderalls and Codecov" /></p>
<p>Charts are automatically added to the <a href="https://help.testspace.com/dashboard/space-metrics" target="_blank">Metrics Tab</a> with:</p>
<ol>
<li>3rd party badge - quick link to coverage details</li>
<li>Percentage of line coverage for each build</li>
<li>Threshold for build results to be deemed healthy</li>
</ol>
<readmore-help>
<a class="btn-arrow" href="https://help.testspace.com/dashboard/space-metrics#code-coverage" target="_blank">
Read more
</a>
</readmore-help>
<h1 id="other-useful-resources">Other Useful Resources</h1>
<ul>
<li><a href="https://help.testspace.com/publish/push-data-results" target="_blank">How to Push Data to Testspace and the types of files supported.</a></li>
<li><a href="https://help.testspace.com/reference/data-formats#code-coverage-formats" target="_blank">Other Code Coverage formats supported.</a></li>
<li><a href="https://help.testspace.com/dashboard/space-metrics#standard" target="_blank">Testspace Standard Metrics.</a></li>
<li><a href="https://help.testspace.com/dashboard/space-metrics#custom" target="_blank">Testspace Custom Metrics.</a></li>
</ul>
<readmore class="readmore">
<a class="btn-arrow" href="/blog/managing-software-quality-under-continuous-integration"> Read more </a>
</readmore>
<h2 id="get-setup-in-minutes">Get setup in minutes!</h2>
<p>Try <a href="/pricing">Testspace</a> risk-free. No credit card is required.</p>
<p>Have questions? <a href="mailto:contact@testspace.com">Contact us.</a></p>
Testspace Team
Testspace Integration with GitLab
2019-11-21T00:00:00+00:00
https://www.testspace.com/blog/testspace-integration-with-gitlab
<p>Here at Testspace we’re excited to introduce yet another Online VCS/CI Integration. Users of <strong>GitLab</strong> - a leader
in Continuous Integration software - can now connect Testspace to their git repositories.</p>
<p><img src="/assets/images/blog/gitlab-plus-testspace.png" alt="GitLab Continuous Integration Dashboard" title="Testspace and GitLab" /></p>
<blockquote>
<p>Testspace provides Quality Analytics to the GitLab Workflow.</p>
</blockquote>
<h1 id="connecting-testspace-to-your-gitlab-groups-is-simple">Connecting Testspace to your GitLab Groups is simple</h1>
<p>With Testspace connected services, there are no plugins to install or configure. Testspace owners and admins
simply connect to their GitLab Groups with a single click.</p>
<p><img src="/assets/images/blog/gitlab-groups.png" alt="GitLab Groups" title="Testspace connects to GitLab Groups" /></p>
<readmore-help>
<a class="btn-arrow" href="https://help.testspace.com/dashboard/admin-account#services" target="_blank">
Read more
</a>
</readmore-help>
<p>Once connected, creating a Testspace project is as simple as selecting a GitLab repository from a list.</p>
<p><img src="/assets/images/blog/gitlab-new-project.png" alt="GitLab Repos" title="Testspace automatically discovers GitLab Repos" /></p>
<readmore-help>
<a class="btn-arrow" href="https://help.testspace.com/dashboard/project" target="_blank">
Read more
</a>
</readmore-help>
<h1 id="adding-testspace-to-the-gitlab-automation-takes-minutes">Adding Testspace to the GitLab Automation Takes Minutes</h1>
<p>Files – output from the Continuous Integration Process – are pushed to the Testspace server using a simple client.
<a href="https://help.testspace.com/publish/push-data-add-to-ci#gitlab-ci" target="_blank">Adding Testspace to online CI</a> is as simple as adding a <strong>client utility</strong> to
the install process.</p>
<p>Once set up…</p>
<p><img src="/assets/images/blog/gitlab-ci-yml.png" alt="GitLab CI YML" title="Simple to include Testspace using a GitLab CI yml file" /></p>
<ul>
<li>Developers receive alerts when branches regress, from any data source or metric.</li>
<li>Leads set and monitor quality objectives for a development branch.</li>
<li>Managers review and assess overall quality effectiveness spanning multiple repositories.</li>
</ul>
<blockquote>
<p>Testspace has zero impact on the workflow!</p>
</blockquote>
<h1 id="so-why-would-you-use-testspace-with-gitlab">So why would you use Testspace with GitLab?</h1>
<ul>
<li>To dashboard test results from every branch under every repository.</li>
<li>To triage and manage test failures.</li>
<li>To track the history of failing test cases.</li>
<li>To monitor standard and custom metrics.</li>
<li>To add Quality Analytics to the GitLab workflow.</li>
</ul>
<readmore class="readmore">
<a class="btn-arrow" href="/blog/managing-software-quality-under-continuous-integration"> Read more </a>
</readmore>
<h2 id="get-setup-in-minutes">Get setup in minutes!</h2>
<p>Try <a href="/pricing">Testspace</a> risk-free. No credit card is required.</p>
<p>Have questions? <a href="mailto:contact@testspace.com">Contact us.</a></p>
Testspace Team
Introducing Testspace Insights
2019-09-12T00:00:00+00:00
https://www.testspace.com/blog/introducing-testspace-insights
<p>We are introducing the first version of <strong>Testspace Insights</strong>. This article provides an overview and sets the stage for
future articles on the details of Insights and how to leverage them to improve the quality of the software development process.</p>
<p><img src="/assets/images/blog/insights-1.png" alt="Insights" title="Introducing Testspace Insights" /></p>
<h1 id="improving-software-quality">Improving Software Quality</h1>
<p>One approach to improving software quality is to improve the development process. Although the approach sounds simple in theory, execution comes with many challenges. Knowing when and where to make process improvements are among the most difficult. A lack of visibility into the quality of the current process coupled with the inability to track improvements leads to decisions based on <code class="language-plaintext highlighter-rouge">hunches and intuition without real data</code>. We’re not saying using hunches and intuition is bad, but improvements are better with real data.</p>
<h1 id="so-what-are-insights">So what are Insights?</h1>
<p>When using Testspace during the development cycle, <em>data</em> generated from testing is automatically collected, stored, and continuously analyzed. This <em>mined data</em> is used to generate Insights. Insights are <code class="language-plaintext highlighter-rouge">indicators</code> and <code class="language-plaintext highlighter-rouge">metrics</code> used to assess the quality of the software development process and make informed process decisions. Each indicator provides an overview status for a specific process area. Indicators help answer questions such as:</p>
<ul>
<li><em>Is the current testing providing value to the team?</em></li>
<li><em>Are regressions being addressed in a timely fashion?</em></li>
<li><em>Are changes in process improving the quality?</em></li>
</ul>
<blockquote>
<p>Insights are used to measure, track, and manage the quality of the software development process, making informed decisions based on analytics.</p>
</blockquote>
<p>Having <code class="language-plaintext highlighter-rouge">actionable data</code> is key to assessing and improving any process. As the project proceeds and grows in complexity, this collected data becomes more valuable. So, instead of just dropping it on the floor, Testspace uses it to help you understand how the project is really doing.</p>
<h1 id="how-does-it-work">How does it work?</h1>
<p>By adding a <em>simple command line utility</em> to the test automation process, <em>test output</em> is pushed to the Testspace server without impacting the workflow. The following command line example pushes <code class="language-plaintext highlighter-rouge">test results</code>, <code class="language-plaintext highlighter-rouge">code coverage</code>, and <code class="language-plaintext highlighter-rouge">static analysis</code> information.</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>testspace tests*.xml coverage.xml static-analysis.xml
</code></pre></div></div>
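<p>As an illustrative sketch only, here is how that push command might slot into a CI configuration. Since we use Circle CI ourselves, the fragment below is CircleCI YAML; the job name, test commands, and output file names are placeholders, and the <code class="language-plaintext highlighter-rouge">testspace</code> invocation mirrors the example above.</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code># Hypothetical CircleCI job fragment -- names and paths are placeholders
jobs:
  build-and-test:
    steps:
      - checkout
      - run: npm test           # assume this emits tests*.xml
      - run: npm run coverage   # assume this emits coverage.xml
      # Push test results, coverage, and static analysis to Testspace
      - run: testspace tests*.xml coverage.xml static-analysis.xml
</code></pre></div></div>
<p>The only addition to an existing pipeline is the final push step, which is why the workflow itself is unaffected.</p>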
<p>In addition to the test output, workflow activities — including new software branches, source code changes, new regressions, and new fixes — are captured as <em>analytical data</em> automatically.</p>
<p>Each project’s <em>data</em> is stored in a database and continuously analyzed for patterns and trends. Metrics are generated and provide historical tracking of items like coverage improvements, technical debt, trends in regressions, and more.</p>
<h1 id="our-data">Our Data</h1>
<p>We measure and track <em>our own</em> Testspace development, and you can see it <a href="https://s2.testspace.com/projects/s2technologies:testspace/insights" target="_blank">HERE</a>.<br />
The Testspace server is developed using <a href="http://rubyonrails.org/" target="_blank">Ruby on Rails</a>, the
source code is hosted using <a href="https://github.com/" target="_blank">GitHub</a>, and we use <a href="https://circleci.com" target="_blank">Circle CI</a> for test automation.</p>
<p><img src="/assets/images/blog/insights-1-graph.png" alt="Test Analytics for Continuous Integration" title="Testspace Insights for Continuous Integration" /></p>
<p>For more information regarding Insights refer to our help documentation <a href="https://help.testspace.com/dashboard/project-insights" target="_blank">here</a>.</p>
<h1 id="our-model-is-constantly-improving">Our Model is constantly improving</h1>
<p>We are constantly monitoring and improving our <code class="language-plaintext highlighter-rouge">algorithms</code> and <code class="language-plaintext highlighter-rouge">statistical models</code> to extract more useful
and <strong>actionable</strong> data that customers can use to improve their software development process. We track regression
patterns, failure resolution rates, change-related risk, and code coverage trends, and will soon add defect
tracking and other variables.</p>
<p>As a company, our focus is to better enable software development teams to leverage all of the data they
generate during development. Building a model that adapts, learns, and prescribes in an automated fashion
is at the heart of Testspace.</p>
<blockquote>
<p>Don’t let <strong>your data</strong> get dropped on the floor</p>
</blockquote>
<h2 id="get-setup-in-minutes">Get set up in minutes!</h2>
<p>Try <a href="/pricing">Testspace</a> risk-free. No credit card is required.</p>
<p>Have questions? <a href="mailto:contact@testspace.com">Contact us.</a></p>Testspace TeamWe are introducing the first version of Testspace Insights. This article provides an overview and sets the stage for future articles on the details of Insights and how to leverage them to improve the quality of the software development process.