WinBuilder Test Plan
Posted 28 February 2010 - 09:14 PM
So, after being referred to the posting Nuno created on the overall plan, I thought it would be a good idea to have a place to discuss testing.
So, let the comments begin...
Posted 01 March 2010 - 02:52 AM
Much calmer here (for now), I'm taking this opportunity to offer some thoughts on WB testing and my overall feeling about its development. People can take or leave what I say, as I am really an outsider looking in, and that's the way I like it; it helps me to stay objective and calm, able to see the big picture a bit more clearly. I have nothing but respect for the developers (they know who they are), even though I may find the development going in directions I don't always understand (at the time). Regardless, I see an overall improvement in WB, and I am grateful for those improvements and for the continued free use of this great tool.
I see WB somewhat as a 'diamond in the rough': it takes some effort and hard work to make it shine, but shine it does, brighter than all the rest, and I'm a bit afraid of polishing it too much. If WB gets too easy to use, too 'packaged', too controlled, it may restrict the 'development' at the user end. This end-product 'evolution' is probably largely overlooked by the core developers, but it is really a large part of the payoff for the rest of us. I think there has to be both, and they need to be somewhat separate; the more one focuses on 'core' development, new features, bug fixes, etc., the harder it is to see the big picture: the end products' (boot disks') usability and customization, and the addition and improvement of tools.
I'm not sure what I'm trying to say other than: let's lighten up a bit, stand back, and see that the big picture is really pretty damn good! Sure, there are more efficient ways of doing things, and of course we all want WB to keep getting better (it is), but let's not throw out the baby...; things are good and getting better every day, and I don't know whether a complete overhaul is necessary or even productive. There need to be those who focus on bug fixes and core development, but not at the expense of stability and usability for the rest of us. I find testing every new release of WB is like re-inventing the wheel over and over again; I think the wheel works well enough to focus on building a better cart, and eventually a car, around it, and that's where I focus my time; the car is looking pretty nice from where I sit.
Posted 03 March 2010 - 10:47 PM
On the other hand, it wouldn't hurt to be a bit more formal in the testing process. For example, if, as stated in the other post on this (Post 101 on WB80), LiveXP is a good "stress" test of the syntax and processing, then that should be the "benchmark" for release.
I have also seen some discussion of making sure we include enough "strange" (at least to me, as an English-language system user and speaker) characters in the paths and scripts, etc. Even things like a longer path (which causes the delete to fail on some of the cached files) are all good things to put in a checklist before the release of even an "alpha" build.
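To make that checklist concrete, here is a minimal sketch (in Python, not WinBuilder script, and purely illustrative; the function name and the specific fragments are my own choices) of the kinds of "tricky" path components such a pre-release checklist could enumerate:

```python
def tricky_paths():
    """Return path fragments that have historically tripped up file
    operations: long names, spaces, and non-ASCII ("strange") characters."""
    return [
        "a" * 200,                      # very long directory name
        "name with spaces",             # spaces in folder names
        "umlauts-\u00e4\u00f6\u00fc",   # non-ASCII characters in the path
        "name.with.many.dots",          # multiple dots in a single component
    ]

for fragment in tricky_paths():
    print(len(fragment), repr(fragment))
```

Each WB build would then be exercised against every fragment before even an "alpha" release goes out.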
But I agree, we also need to find ways to better organize and support the projects that layer on top of this "diamond"...
Posted 14 March 2010 - 02:25 AM
We could begin by writing a script that tests file operations (FileCopy, FileDelete, FileRename, FileCreateBlank, etc.).
For each WB operation being exercised, we should perform a series of tests that output the result as successful or not.
For example, testing filecopy:
- check that it works with very long directory names
- check that it works with spaces in folder names
- test how it reacts when no source file is found
- test cases where it has no permission to write to the target folder
- test what happens when the target folder runs out of disk space
- test overwriting of files (and any relevant switches)
- .... (more to be added)
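The list above can be sketched as runnable checks. This is a Python analogue, not actual WinBuilder script: it exercises a subset of the listed FileCopy cases (spaces, long paths, missing source, overwrite) against the standard library's copy function, with each case returning pass/fail so results can later be summed. All names here (`check`, `run_filecopy_tests`) are hypothetical.

```python
import os
import shutil
import tempfile

def check(name, fn):
    """Run one test case and report pass/fail, never crashing the harness."""
    try:
        ok = fn()
    except Exception:
        ok = False
    print(f"{'PASS' if ok else 'FAIL'}: {name}")
    return ok

def run_filecopy_tests():
    results = []
    with tempfile.TemporaryDirectory() as root:
        src = os.path.join(root, "source.txt")
        with open(src, "w") as f:
            f.write("data")

        # 1. Copy into a directory with spaces in its name.
        def spaces():
            dst_dir = os.path.join(root, "folder with spaces")
            os.makedirs(dst_dir)
            shutil.copy(src, dst_dir)
            return os.path.isfile(os.path.join(dst_dir, "source.txt"))
        results.append(check("spaces in folder name", spaces))

        # 2. Copy into a deeply nested (long) directory path.
        def long_path():
            dst_dir = os.path.join(root, *["sub"] * 10)
            os.makedirs(dst_dir)
            shutil.copy(src, dst_dir)
            return os.path.isfile(os.path.join(dst_dir, "source.txt"))
        results.append(check("long directory path", long_path))

        # 3. A missing source file must be reported cleanly, not crash.
        def missing_source():
            try:
                shutil.copy(os.path.join(root, "no-such-file"), root)
            except FileNotFoundError:
                return True  # failure was detected and reported
            return False
        results.append(check("missing source detected", missing_source))

        # 4. Overwriting an existing target file.
        def overwrite():
            dst = os.path.join(root, "target.txt")
            with open(dst, "w") as f:
                f.write("old")
            shutil.copy(src, dst)
            with open(dst) as f:
                return f.read() == "data"
        results.append(check("overwrite existing target", overwrite))
    return results

if __name__ == "__main__":
    run_filecopy_tests()
```

The permission and disk-space cases are deliberately left out of the sketch; they need environment setup (a read-only folder, a tiny volume) that a real WB test script would have to provision.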
My idea is that each script should test a specific WB operation, to allow us to add more test cases in the future.
It would be even better if we could somehow sum the results into a score.
For example, "WB 081 passed 92% of the available tests" would help us understand and pinpoint what is changing between versions.
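The scoring idea is simple to sketch. Below is a hedged Python illustration (the function name and the sample results are invented; 11 of 12 passes happens to round to the 92% mentioned above):

```python
def score(results):
    """Collapse a list of boolean pass/fail results into a pass percentage."""
    if not results:
        return 0.0
    return 100.0 * sum(results) / len(results)

# Hypothetical results for one build: 11 of 12 tests passed.
build_results = [True] * 11 + [False]
print(f"compatibility: {score(build_results):.0f}%")  # prints "compatibility: 92%"
```

Tracking this single number per build would make regressions between versions visible at a glance.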