TODO for TestNG
* Pass the XmlTest in @Before/@After methods (see the sketch after this list)
* Allow a testng.xml file to be passed when -testjar is used
* Add onStart to IConfigurationListener (create a new interface, actually)
* Add timeout to @Before/@After
* Pass parameters from ant
* Make it possible to specify groups on the command line and classes in testng.xml
(and any combinations thereof: command line, ant, testng.xml)
* DataProvider index in testng.xml
* Create a servlet for remote driving
* Add time-outs at the testng.xml level. Also: test and suite time-outs? http://tinyurl.com/kbwxq
* Add working dir to the ant task
* Add a servlet so TestNG can be invoked from a web browser
* Make it possible to add listeners from the Eclipse plug-in
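A minimal sketch of what the first item above (passing the XmlTest into @Before/@After
methods) might look like, assuming TestNG would inject the current XmlTest as a declared
parameter of a configuration method; the injection itself is the hypothetical part, and
the class and method names are made up:

  import org.testng.annotations.BeforeMethod;
  import org.testng.xml.XmlTest;

  public class XmlTestInjectionSketch {
    // Hypothetical: TestNG would recognize the XmlTest parameter and pass in
    // the <test> definition that the current method belongs to.
    @BeforeMethod
    public void setUp(XmlTest currentXmlTest) {
      System.out.println("Setting up for test: " + currentXmlTest.getName());
    }
  }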
Doc:
* Document the fact that @Test methods with return values are ignored.
===========================================================================
Older TODOs:
* Show enabled=false methods in the reports, as well as methods
in groups that were not run
* Multi-threading for invocationCount and maybe for <test> too
* Annotation to specify that a method should be called concurrently by n threads
  (on second thought, we should do that for an entire group); see the sketch at the end of this list
* more thread ideas: http://www.theserverside.com/news/thread.tss?thread_id=38922
* package support for command line and ant
* Parameters for classes (to be passed as parameters to constructors)
* testng-dist.zip should contain a top-level directory
* For dependent methods, if the user is trying to run an incomplete graph
  (A depends on B but B is excluded from the run), what should happen? Ignore
  the exclusion on B and run it anyway, or abort with an exception explaining
  what is going on?
* Make timeOut() work with milliseconds (but keep seconds for backward
compatibility)
* Improve the plug-in API so people can add listeners without having to
modify TestRunner
* Use factories for the programmatic API.
* Add dynamic generation of tests
* Make Javadoc comments over methods appear in the final report
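Regarding the concurrent-invocation annotation idea above: the invocationCount and
threadPoolSize attributes (both mentioned elsewhere in this file) are how this
eventually surfaced on @Test. A rough sketch of that usage, with a made-up method name:

  import org.testng.annotations.Test;

  public class ConcurrentInvocationSketch {
    // Invoke the same method 10 times, spread across 3 worker threads.
    @Test(invocationCount = 10, threadPoolSize = 3)
    public void hammerTheServer() {
      // illustrative body: exercise whatever needs to survive concurrent calls
    }
  }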
Documentation:
* IHookable
* List<IReporter>
DONE
* Retry patch
* If a method with invocationCount fails, don't run the others
* Allow multiple listeners in ant task
* Add working dir to the ant task
* Introduce "test" and "suite" parameters to @Test at the class level to
avoid having to use testng.xml
* Remove TestNG stack traces from the report
* When 0 tests were run, exit with an error http://tinyurl.com/ftng6
* Show all the extra output for all methods in a single
dedicated page
* report API
* Show parameters used to invoke a specific test
* show skipped groups/methods in HTML report
* beforeTestGroups
* <testng classfileset> doesn't add to classpath
* threadPoolSize
* Parameter logging
* JavaDoc for org.testng.TestNG
* org.testng.Reporter in the HTML report (screenshot ready)
* Document that @Parameters only works for @Test (should also mention @Factory and @Configuration)
* Document: <testng classfileset> doesn't add to classpath
* Document org.testng.Reporter in the HTML report (screenshot ready)
* Document parameter logging
* JavaDoc for org.testng.TestNG
* Document threadPoolSize
* Make sure it can run several times in the same JVM
* Implement invocationCount and successPercentage
* Support multiple testng.xml (TestNG allows it but not the reporters
yet)
* The HTML reporter collapses all the suites into one report, we should
create one HTML report per suite
* testng-failed.xml should contain the parameters of testng.xml (if any)
* Create a testng-failed.xml that includes dependent methods
* Generic reporter with compare(ITestResult, ITestResult) to make it easier to write
  reporters such as "slowest method first" or to generate testng-failed.xml
* Iterator factories
* configuration methods don't respect inheritance
- build.xml should issue a clear error message if trying to build with JDK1.4
- Implement <taskdef resource="testnganttasks"> so we can define
  ant tasks for TestNG and JUnitConverter automatically
- Write documentation to declare ant task in section 3.2.8
- Documentation for alwaysRun
- Allow specifying packages or a prefix in the <classes> tag:
  <classes prefix="com.beust.testng"><class name="A"/><class name="B"/></classes>
- New assert classes
- New ways to launch
- JUnitConverter documentation
- new beforeSuite/afterSuite
* in testng-failures.xml include the beforeSuite/afterSuite methods (very tricky)
* Provide log.properties configuration (not using log any more)
* Make @ExpectedExceptions fail if no exception is thrown
* Make timeOut() work in non-parallel mode (the default mode needs to become
  parallel=true thread-count=1)
* The exception thrown when a test passes because of an @ExpectedExceptions annotation is not
  available via the TestNG API: ITestResult.getThrowable().
* Add assert API for arrays and collections (undecided yet: partial asserting)
* dependsOnMethods
Allow specifying <groups> at the <suite> level
Make TestNG run on class files and not just on testng.xml
Make TestNG run on a jar file that has a testng.xml file in its root or just on all
the classes inside that jar file.
Implement parameter passing for tests: define a property in the properties
file and pass it to the test method:
@Test(params = { "${fn}", "${ln}" })
public void testNames(String firstName, String lastName) {
}
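For comparison, a rough sketch of how the existing @Parameters support (mentioned in the
DONE section above) covers a similar case, with values coming from <parameter> tags in
testng.xml rather than a properties file; the parameter names and class name here are
just illustrative:

  import org.testng.annotations.Parameters;
  import org.testng.annotations.Test;

  public class ParameterPassingSketch {
    // "fn" and "ln" would be defined as <parameter name="fn" value="..."/>
    // entries in testng.xml.
    @Parameters({ "fn", "ln" })
    @Test
    public void testNames(String firstName, String lastName) {
      System.out.println(firstName + " " + lastName);
    }
  }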
Run groups of groups
List all tests that will be run, or show methods per group
HTML generation
Make test and class methods discoverable
JUnit adapter
Multiple ExpectedException (see the sketch after this list)
Inheritance
Test listeners
Group regexps for launching
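A sketch for the "Multiple ExpectedException" item above, shown with the expectedExceptions
attribute that @Test ended up exposing; the exception types and names are only examples:

  import java.io.IOException;
  import org.testng.annotations.Test;

  public class MultipleExpectedExceptionsSketch {
    // Passes if the method throws either of the listed exception types.
    @Test(expectedExceptions = { IOException.class, IllegalStateException.class })
    public void failsInOneOfTwoWays() throws IOException {
      throw new IllegalStateException("either type satisfies the expectation");
    }
  }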
====
A new comment has been posted on your blog Otaku, Cedric's weblog, on entry
#149 (The poor shape of Unit Testing).
http://beust.com/weblog/archives/000149.html
IP Address: 68.72.49.189
Name: Curt Cox
Email Address: curtcox@gmail.com
URL:
Comments:
For whatever it's worth, here are my problems with JUnit. I've largely developed work-arounds.
1. Interface-based testing is awkward.
2. Only fast-fail tests are supported.
3. The design is brittle, poorly documented, and thus hard to extend.
How should interface-based testing be handled in TestNG? A nice feature would be bundled tests for interfaces in the java* namespace that are automatically applied. In other words:
- all classes will fail a (suppressible) test if they violate the contract for equals() and hashCode()
- classes that implement Comparable will fail a test if they violate the Comparable contract
- and likewise for java.io.Serializable, java.util.Map, java.util.List, etc...
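A minimal sketch of the kind of bundled contract test suggested above, for the
equals()/hashCode() case. Nothing like this class exists in TestNG; the abstract hook
method is made up, and a real version would need a way to plug in the class under test:

  import org.testng.Assert;
  import org.testng.annotations.Test;

  // Hypothetical reusable equals()/hashCode() contract test: a subclass would
  // supply instances that are expected to be equal to each other.
  public abstract class EqualsContractTest {

    protected abstract Object newEqualInstance();  // hypothetical hook method

    @Test
    public void equalsIsReflexive() {
      Object a = newEqualInstance();
      Assert.assertEquals(a, a);
    }

    @Test
    public void equalObjectsHaveEqualHashCodes() {
      Object a = newEqualInstance();
      Object b = newEqualInstance();
      if (a.equals(b)) {
        Assert.assertEquals(a.hashCode(), b.hashCode());
      }
    }

    @Test
    public void equalsRejectsNull() {
      Object a = newEqualInstance();
      Assert.assertFalse(a.equals(null));
    }
  }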
When I say all tests are fast-fail in JUnit, what I mean is that the first failed assertion in a method causes the test to exit. While this is usually desirable, a more conversational style of tests can sometimes be much easier to read and write. Such a conversational test doesn't generate a simple failure, but rather a score. The score could be either x of y passed or x percent passed. The important part is that the first failure doesn't terminate the test.
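A hand-rolled sketch of such a "conversational" (non-fast-fail) test: individual checks
are recorded instead of throwing immediately, and only a final assertion reports a
score-like summary. Everything here is illustrative; nothing beyond @Test and
org.testng.Assert is TestNG API:

  import java.util.ArrayList;
  import java.util.List;
  import org.testng.Assert;
  import org.testng.annotations.Test;

  public class ConversationalTestSketch {

    private final List<String> failures = new ArrayList<String>();
    private int checks = 0;

    // Record the result of a check instead of failing right away.
    private void check(boolean condition, String description) {
      checks++;
      if (!condition) {
        failures.add(description);
      }
    }

    @Test
    public void listContract() {
      List<String> list = new ArrayList<String>();
      check(list.isEmpty(), "new list should be empty");
      list.add("a");
      check(list.size() == 1, "size should be 1 after one add");
      check(list.contains("a"), "list should contain the added element");

      // Only now does the test fail, reporting how many checks passed.
      Assert.assertTrue(failures.isEmpty(),
          (checks - failures.size()) + " of " + checks + " checks passed; failures: " + failures);
    }
  }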
As I said, I've largely developed work-arounds for doing these in JUnit, but developing tools for conversational tests that play nicely with the various JUnit runners was a real challenge. The exact contract that Eclipse expects of JUnit tests turns out to be different from what either the bundled runners or the Ant task expects. Anyone who considers either the JUnit code or interfaces well-documented has a much different concept of well-documented than I do.