#2963 closed defect (fixed)
Add a testsuite top-level configuration file that is common to all tests.
Reported by: | Chris Johns | Owned by: | Chris Johns |
---|---|---|---|
Priority: | normal | Milestone: | 5.1 |
Component: | unspecified | Version: | 5 |
Severity: | normal | Keywords: | testing |
Cc: | | Blocked By: | |
Blocking: | | | |
Description
Add the file testsuites/rtems.tcfg to hold test states common to all BSPs. This lets us globally set a test state. For example, fileio is user-input.
Note, user-input will be added as a test state to test this file.
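For illustration only, a minimal sketch of what such a common configuration might contain, assuming a simple "state: test-name" line format with # comment lines; the actual directives and the full list of affected tests are defined by the testsuite tooling, not by this sketch:

# testsuites/rtems.tcfg (sketch; syntax and entries are assumptions)
#
# Test states common to all BSPs.
#
# Interactive tests that wait for user input and never terminate on
# their own, so a test runner cannot score them as a plain pass/fail.
user-input: fileio
user-input: top
user-input: termios
user-input: monitor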
Change History (8)
comment:1 Changed on 03/31/17 at 04:59:42 by Sebastian Huber
I am not very fond of global files. The tests should be self contained. Maybe add some special comments to the test sources like in the GCC test suite?
comment:2 follow-up: 3 Changed on 03/31/17 at 06:26:42 by Chris Johns
I have never looked at the gcc testsuite, and what I have read about it was not flattering.
I do not think adding 500+ files to state that fileio is a user-input test that will never complete is a good idea. Maybe global is not a great word; maybe common is better. We need accurate data to determine the results of tests.
It is similar to the work you have been doing to have a common linkercmd file wherever possible. It is the same thing, or are you saying we should create a separate linker command file for every BSP as well? ;)
Look at the results with a work-in-progress rtems-test for erc32-run:

Passed:        546
Failed:          1
User Input:      4
Expected Fail:   0
Indeterminate:   0
Timeout:         6
Invalid:         1
------------------
Total:         558

Failures: spcontext01.exe
User Input: fileio.exe top.exe termios.exe monitor.exe
Timeouts: jffs2_fssymlink.exe mrfs_fserror.exe dhrystone.exe fsdosfsformat01.exe imfs_fsrdwr.exe whetstone.exe
Invalid: minimum.exe

Average test time: 0:00:00.481800
Testing time     : 0:04:28.844749
Note, the benchmark tests have broken parallel testing because of the time they now take.
comment:3 follow-up: 4 Changed on 03/31/17 at 06:32:12 by Sebastian Huber
Replying to Chris Johns:
I have never looked at the gcc testsuite, and what I have read about it was not flattering.
My comment was not about the GCC testsuite in general.
I do not think adding 500+ files to state that fileio is a user-input test that will never complete is a good idea. Maybe global is not a great word; maybe common is better. We need accurate data to determine the results of tests.
Why 500+ files? It's just one:
diff --git a/testsuites/samples/fileio/init.c b/testsuites/samples/fileio/init.c
index 07ec2c6..68942e8 100644
--- a/testsuites/samples/fileio/init.c
+++ b/testsuites/samples/fileio/init.c
@@ -34,6 +34,7 @@
 #include <rtems/nvdisk-sram.h>
 #include <rtems/shell.h>
 
+/* FANCY TEST COMMENT: user-input */
 const char rtems_test_name[] = "FILE I/O";
 
 #if FILEIO_BUILD
It is similar to the work you have been doing to have a common linkercmd file wherever possible. It is the same thing, or are you saying we should create a separate linker command file for every BSP as well? ;)
Look at the results with a work-in-progress rtems-test for erc32-run:

Passed:        546
Failed:          1
User Input:      4
Expected Fail:   0
Indeterminate:   0
Timeout:         6
Invalid:         1
------------------
Total:         558

Failures: spcontext01.exe
User Input: fileio.exe top.exe termios.exe monitor.exe
Timeouts: jffs2_fssymlink.exe mrfs_fserror.exe dhrystone.exe fsdosfsformat01.exe imfs_fsrdwr.exe whetstone.exe
Invalid: minimum.exe

Average test time: 0:00:00.481800
Testing time     : 0:04:28.844749

Note, the benchmark tests have broken parallel testing because of the time they now take.
On my host these benchmark tests ran in less than 3 minutes.
comment:4 Changed on 03/31/17 at 07:05:39 by Chris Johns
Replying to Sebastian Huber:
Replying to Chris Johns:
I do not think adding 500+ files to state that fileio is a user-input test that will never complete is a good idea. Maybe global is not a great word; maybe common is better. We need accurate data to determine the results of tests.
Why 500+ files? It's just one:
diff --git a/testsuites/samples/fileio/init.c b/testsuites/samples/fileio/init.c
index 07ec2c6..68942e8 100644
--- a/testsuites/samples/fileio/init.c
+++ b/testsuites/samples/fileio/init.c
@@ -34,6 +34,7 @@
 #include <rtems/nvdisk-sram.h>
 #include <rtems/shell.h>
 
+/* FANCY TEST COMMENT: user-input */
 const char rtems_test_name[] = "FILE I/O";
 
 #if FILEIO_BUILD
Sure, I thought you were talking about tcfg files. There is no standard for this, plus how does the comment get to rtems-test?
I am leveraging the expected-fail mechanism to handle this. That needs to be external to the test.
All I am doing is collecting these things into a common place and a common framework.
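As a purely hypothetical illustration of that common framework (the file path, the include directive, and the expected-fail entry below are assumptions, not the committed syntax), a BSP-specific configuration could pull in the common states and then add states that apply only to that BSP:

# erc32 test configuration (hypothetical path and syntax)
#
# Pull in the test states common to all BSPs.
include testsuites/rtems.tcfg
#
# Tests known to fail on this BSP until the underlying defect is fixed.
expected-fail: spcontext01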
It is similar to the work you have been doing to have a common linkercmd file wherever possible. It is the same thing, or are you saying we should create a separate linker command file for every BSP as well? ;)
Look at the results with a work-in-progress rtems-test for erc32-run:

Passed:        546
Failed:          1
User Input:      4
Expected Fail:   0
Indeterminate:   0
Timeout:         6
Invalid:         1
------------------
Total:         558

Failures: spcontext01.exe
User Input: fileio.exe top.exe termios.exe monitor.exe
Timeouts: jffs2_fssymlink.exe mrfs_fserror.exe dhrystone.exe fsdosfsformat01.exe imfs_fsrdwr.exe whetstone.exe
Invalid: minimum.exe

Average test time: 0:00:00.481800
Testing time     : 0:04:28.844749

Note, the benchmark tests have broken parallel testing because of the time they now take.
On my host these benchmark tests ran in less than 3 minutes.
All cores fully loaded?
comment:5 Changed on 04/03/17 at 22:11:13 by Chris Johns
Owner: | changed from joel.sherrill@… to Chris Johns |
---|
comment:6 Changed on 04/03/17 at 23:52:54 by Chris Johns <chrisj@…>
Resolution: | → fixed |
---|---|
Status: | assigned → closed |
In 258bda3/rtems:
comment:7 Changed on 10/10/17 at 06:46:55 by Sebastian Huber
Component: | testing → unspecified |
---|
comment:8 Changed on 11/09/17 at 06:27:14 by Sebastian Huber
Milestone: | 4.12.0 → 5.1 |
---|
Milestone renamed