#2963 closed defect (fixed)

Add a testsuite top level configuration file that is common to all tests.

Reported by: Chris Johns
Owned by: Chris Johns
Priority: normal
Milestone: 5.1
Component: unspecified
Version: 5
Severity: normal
Keywords: testing
Cc:
Blocked By:
Blocking:

Description

Add the file testsuites/rtems.tcfg to hold test states common to all BSPs. This lets us globally set a test state.

For example, fileio is user-input.

Note, user-input will be added as a test state to test this file.
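
A sketch of what such a file might contain, assuming a simple one-directive-per-line 'state: test' format (the exact syntax here is illustrative, not a specification); the test names are the ones discussed later in this ticket:

#
# Test states common to all BSPs.
#
user-input: fileio
user-input: top
user-input: termios
user-input: monitor
benchmark: dhrystone
benchmark: whetstone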

Change History (8)

comment:1 Changed on 03/31/17 at 04:59:42 by Sebastian Huber

I am not very fond of global files. The tests should be self-contained. Maybe add some special comments to the test sources, like in the GCC test suite?

/* { dg-options "-mthumb -Os" } */
/* { dg-require-effective-target arm_thumb2_ok } */
/* { dg-final { scan-assembler "ands" } } */

comment:2 Changed on 03/31/17 at 06:26:42 by Chris Johns

I have never looked at the gcc testsuite, and what I have read about it was not flattering.

I do not think adding 500+ files to state that fileio is a user-input test and will never complete is a good approach. Maybe global is not a great word; maybe common is better. We need accurate data to determine the results of tests.

It is similar to the work you have been doing to have a common linkcmds file wherever possible. It is the same thing, or are you saying we should create a separate linker command file for every BSP as well? ;)

Look at the results with a work in progress rtems-test for erc32-run:

Passed:        546
Failed:          1
User Input:      4
Expected Fail:   0
Indeterminate:   0
Timeout:         6
Invalid:         1
------------------
Total:         558

Failures:
 spcontext01.exe
User Input:
 fileio.exe
 top.exe
 termios.exe
 monitor.exe
Timeouts:
 jffs2_fssymlink.exe
 mrfs_fserror.exe
 dhrystone.exe
 fsdosfsformat01.exe
 imfs_fsrdwr.exe
 whetstone.exe
Invalid:
 minimum.exe
Average test time: 0:00:00.481800
Testing time     : 0:04:28.844749

Note, the benchmark tests have broken parallel testing because of the time they now take.

comment:3 in reply to: 2; Changed on 03/31/17 at 06:32:12 by Sebastian Huber

Replying to Chris Johns:

I have never looked at the gcc testsuite, and what I have read about it was not flattering.

My comment was not about the GCC testsuite in general.

I do not think adding 500+ files to state that fileio is a user-input test and will never complete is a good approach. Maybe global is not a great word; maybe common is better. We need accurate data to determine the results of tests.

Why 500+ files? It's just one:

diff --git a/testsuites/samples/fileio/init.c b/testsuites/samples/fileio/init.c
index 07ec2c6..68942e8 100644
--- a/testsuites/samples/fileio/init.c
+++ b/testsuites/samples/fileio/init.c
@@ -34,6 +34,7 @@
 #include <rtems/nvdisk-sram.h>
 #include <rtems/shell.h>
 
+/* FANCY TEST COMMENT: user-input */
 const char rtems_test_name[] = "FILE I/O";
 
 #if FILEIO_BUILD

It is similar to the work you have been doing to have a common linkcmds file wherever possible. It is the same thing, or are you saying we should create a separate linker command file for every BSP as well? ;)

Look at the results with a work in progress rtems-test for erc32-run:

Passed:        546
Failed:          1
User Input:      4
Expected Fail:   0
Indeterminate:   0
Timeout:         6
Invalid:         1
------------------
Total:         558

Failures:
 spcontext01.exe
User Input:
 fileio.exe
 top.exe
 termios.exe
 monitor.exe
Timeouts:
 jffs2_fssymlink.exe
 mrfs_fserror.exe
 dhrystone.exe
 fsdosfsformat01.exe
 imfs_fsrdwr.exe
 whetstone.exe
Invalid:
 minimum.exe
Average test time: 0:00:00.481800
Testing time     : 0:04:28.844749

Note, the benchmark tests have broken parallel testing because of the time they now take.

On my host these benchmark tests ran in less than 3 minutes.

comment:4 in reply to: 3; Changed on 03/31/17 at 07:05:39 by Chris Johns

Replying to Sebastian Huber:

Replying to Chris Johns:

I do not think adding 500+ files to state that fileio is a user-input test and will never complete is a good approach. Maybe global is not a great word; maybe common is better. We need accurate data to determine the results of tests.

Why 500+ files? It's just one:

diff --git a/testsuites/samples/fileio/init.c b/testsuites/samples/fileio/init.c
index 07ec2c6..68942e8 100644
--- a/testsuites/samples/fileio/init.c
+++ b/testsuites/samples/fileio/init.c
@@ -34,6 +34,7 @@
 #include <rtems/nvdisk-sram.h>
 #include <rtems/shell.h>
 
+/* FANCY TEST COMMENT: user-input */
 const char rtems_test_name[] = "FILE I/O";
 
 #if FILEIO_BUILD

Sure, I thought you were talking about tcfg files. There is no standard for this, plus how does the comment get to rtems-test?

I am leveraging the expected-fail mechanism to handle this. That needs to be external to the test.

All I am doing is collecting these things into a common place and a common framework.
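
For illustration only, here is a minimal Python sketch of how a tester such as rtems-test could consume a common state file, assuming the hypothetical 'state: test' format sketched in the description; parse_states and the hard-coded path are assumptions for the example, not the actual rtems-test code:

# Hypothetical sketch: build a test-name -> state map from a common
# configuration file so the tester knows, for example, that fileio
# needs user input and should be stopped rather than timed out.
def parse_states(path):
    states = {}
    with open(path) as cfg:
        for line in cfg:
            line = line.strip()
            if not line or line.startswith('#'):
                continue
            state, test = (part.strip() for part in line.split(':', 1))
            states[test] = state
    return states

states = parse_states('testsuites/rtems.tcfg')
if states.get('fileio') == 'user-input':
    print('fileio needs user input; stop it rather than wait for a timeout')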

It is similar to the work you have been doing to have a common linkcmds file wherever possible. It is the same thing, or are you saying we should create a separate linker command file for every BSP as well? ;)

Look at the results with a work in progress rtems-test for erc32-run:

Passed:        546
Failed:          1
User Input:      4
Expected Fail:   0
Indeterminate:   0
Timeout:         6
Invalid:         1
------------------
Total:         558

Failures:
 spcontext01.exe
User Input:
 fileio.exe
 top.exe
 termios.exe
 monitor.exe
Timeouts:
 jffs2_fssymlink.exe
 mrfs_fserror.exe
 dhrystone.exe
 fsdosfsformat01.exe
 imfs_fsrdwr.exe
 whetstone.exe
Invalid:
 minimum.exe
Average test time: 0:00:00.481800
Testing time     : 0:04:28.844749

Note, the benchmark tests have broken parallel testing because of the time they now take.

On my host these benchmark tests ran in less than 3 minutes.

All cores fully loaded?

Last edited on 03/31/17 at 07:06:13 by Chris Johns

comment:5 Changed on 04/03/17 at 22:11:13 by Chris Johns

Owner: changed from joel.sherrill@… to Chris Johns

comment:6 Changed on 04/03/17 at 23:52:54 by Chris Johns <chrisj@…>

Resolution: fixed
Status: assigned → closed

In 258bda3/rtems:

testsuite: Add a common test configuration. Fix configure.ac and Makefile.am errors.

  • Add a top level test configuration file for test states that are common to all BSPs. This saves adding a test configuration (tcfg) file for every BSP.
  • Add the test states 'user-input' and 'benchmark'. This lets 'rtems-test' stop the test rather than waiting for a timeout or letting a benchmark run without the user asking for it to run.
  • Implement rtems-test-check in Python to make it faster. The shell script had grown to a point where it was noticeably slowing the build down.
  • Fix the configure.ac and Makefile.am files for a number of the test directories. The files are difficult to keep in sync with the number of tests, and mistakes can happen, such as tests being left out of the build. The test fsrofs01 is an example. Also, there was a mix of SUBDIRS and _SUBDIRS being used, and only _SUBDIRS should be used.
  • Fix the test fsrofs01 so it compiles.

Closes #2963.
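
The commit above notes that rtems-test-check was reimplemented in Python because the shell script had become slow. As a rough illustration of the idea only, and not the shipped script, per-BSP test configuration could be resolved along the lines below; the 'include' and 'exclude:' directives and the collect_excludes helper are assumptions for the example:

import os

# Hypothetical sketch: walk a tcfg file, follow include directives and
# collect the tests flagged for exclusion. Not the shipped rtems-test-check.
def collect_excludes(path, seen=None):
    seen = set() if seen is None else seen
    if path in seen:
        return set()
    seen.add(path)
    excludes = set()
    base = os.path.dirname(path)
    with open(path) as cfg:
        for line in cfg:
            line = line.strip()
            if not line or line.startswith('#'):
                continue
            if line.startswith('include '):
                nested = os.path.join(base, line.split(None, 1)[1])
                excludes |= collect_excludes(nested, seen)
            elif line.startswith('exclude:'):
                excludes.add(line.split(':', 1)[1].strip())
    return excludes

print(sorted(collect_excludes('testsuites/rtems.tcfg')))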

comment:7 Changed on 10/10/17 at 06:46:55 by Sebastian Huber

Component: testing → unspecified

comment:8 Changed on 11/09/17 at 06:27:14 by Sebastian Huber

Milestone: 4.12.0 → 5.1

Milestone renamed
