Computer Problem Solving (4.13)
System Development
System Development Life Cycle
Covers the main stages of system development and the different approaches taken
Iterative approach - stages are combined and revisited during the project
Life Cycle - eventually technology or business requirements
make a system obsolete, and the cycle begins again
Analysis
Define it
Understanding and defining user requirements
Carried out by
systems analyst
- good communication skills,
trained in techniques for thorough investigation and for documenting
findings (business expertise not necessary); works very closely with the client to fully understand requirements
Outputs
Problem definition - clear definition of problem, domain
System objectives - clear description of the system's purpose; objectives should be SMART (Specific, Measurable, Achievable, Realistic, Time-bound)
Feasibility Study
Precedes the analysis stage; most senior executives
will want to see its results before they sanction project
expenditure. A high-level look at the key issues and risks;
will include a recommendation as to whether to proceed with the project
Design
Define it
System architects oversee the design process,
drawing on the specialist skills and expertise of user interface
designers, database designers and security experts;
leads to a design specification
Outputs
User Interface; System outputs (reports, graphs, emails);
Algorithms; Data Structures; Security Features
Build
Define it
Most developments are tackled by teams of programmers
Modular Design - separate teams for the user interface, back-end
processing, databases, functional testing etc.
Organisations may have style guides/in-house coding conventions
Version Control
- changes can be rolled back or amalgamated
Major changes (and the reasons for them) are recorded, documenting the development process
Project Manager
- oversees development; with an iterative approach
(employing prototyping/agile methods) there will be a focus
on resolving the tasks critical to success
Output
Technical Documentation
- explains how the system works;
will be needed by those who maintain the system in future
Many tools allow technical documentation to be generated automatically; good coding practice makes systems largely self-documenting (see the sketch below)
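An illustrative sketch: in Python, descriptive names and docstrings are what make code self-documenting, and the standard pydoc tool can generate reference pages from them automatically. The billing module and invoice_total function here are hypothetical.

```python
"""billing.py - calculates customer invoice totals (hypothetical module)."""

def invoice_total(net_amount: float, vat_rate: float = 0.20) -> float:
    """Return the gross invoice total including VAT.

    Descriptive names and docstrings make the code largely
    self-documenting; a tool such as pydoc can turn them into
    reference pages automatically (e.g. `python -m pydoc billing`).
    """
    return net_amount * (1 + vat_rate)
```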
User Documentation
- may be required for parts of the system that are complex, or if the client has a number of inexperienced users
Testing
Different Approaches
Bottom-up Testing
- each small module
is tested as it is developed, and modules are
amalgamated only when they are bug-free
Top-down Testing
- focuses less on functionality,
more on structure; often an appropriate approach
to testing the development of the user interface, whose 'look'
can be tested without worrying about functionality
White Box Testing
(aka structural testing) - checks all of the pathways
through the code so that it works in every circumstance.
Called 'white box' because the tester needs to look at the code in detail,
using carefully designed test data (boundary, erroneous, normal) to check for correct output; each test must illuminate
something new and not just duplicate insights gained from
other tests (see the sketch below)
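A minimal sketch of path-based test design, assuming a hypothetical discount function: each test case is chosen to drive execution down a different pathway through the code, so no test merely duplicates another.

```python
def discount(order_value: float) -> float:
    """Return the discount rate for an order (hypothetical rules)."""
    if order_value < 0:
        raise ValueError("order value cannot be negative")
    elif order_value >= 100:
        return 0.10   # path 1: large orders
    elif order_value >= 50:
        return 0.05   # path 2: medium orders
    else:
        return 0.0    # path 3: small orders

# One test per pathway - each illuminates something new.
assert discount(150) == 0.10     # normal data, path 1
assert discount(100) == 0.10     # boundary: just inside path 1
assert discount(99.99) == 0.05   # boundary: just outside path 1
assert discount(10) == 0.0       # normal data, path 3
try:
    discount(-1)                 # erroneous data: error path
except ValueError:
    pass
```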
Black Box Testing
Focussed on functionality; checks the system does what it is supposed to do without examining the code, done with reference
to the requirements specification. Test data is entered through the user
interface and the output is viewed the same way
System Testing
Testing the system as a whole rather than discrete modules;
involves planning a data journey from start to finish
Stress Testing
(aka performance testing) - makes sure the system can handle the
anticipated volume of users/data (e.g. MMOs); see the sketch below
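A toy sketch of the idea in Python: fire many concurrent requests at one operation and measure throughput. handle_request is a hypothetical stand-in for the real system; a production stress test would normally use a dedicated load-testing tool against the deployed system.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def handle_request(user_id: int) -> str:
    """Stand-in for real request handling (hypothetical)."""
    time.sleep(0.001)          # simulate a little work per request
    return f"response for user {user_id}"

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=50) as pool:
    results = list(pool.map(handle_request, range(5000)))
elapsed = time.perf_counter() - start
print(f"{len(results)} requests in {elapsed:.2f}s "
      f"({len(results) / elapsed:.0f} req/s)")
```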
Penetration Testing
Testing to make sure system is secure
from hackers/other malicious attacks
Alpha Testing
Final testing by the dev team (or associates) using
documented test plans, with a focus on whole-system
functionality & usability
Beta Testing
Unstructured testing by a range of selected end users;
allows testing on a wide range of hardware and a diverse set of
supporting software (operating systems, browsers etc.)
Acceptance Testing
Final testing with the intended user(s);
often uses scenarios the users follow to show
the system meets the requirements specification
Test Data
Test Plans
Important to specify the precise nature of the tests
to be carried out; a test plan specifies the purpose of each test,
the choice of test data and the expected results (see the sketch after this list)
Normal/Typical
Values you expect a user to enter
Erroneous Data
Values that should be rejected (if entered)
Boundary/Extreme Data
Values at the limit of validity; some instances check
just inside the boundary, some check just outside,
where the value is not accepted. The alternative name
'extreme' refers to extremely large values, which matter
in systems handling extreme values because some built-in
data types may not store them accurately
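A sketch of how a test plan can tie the three categories together, assuming a hypothetical age-validation rule (accept ages 18-65 inclusive); each entry records the purpose of the test, the test data and the expected result.

```python
def valid_age(age: int) -> bool:
    """Accept ages 18-65 inclusive (hypothetical rule)."""
    return 18 <= age <= 65

# Test plan: (purpose, test data, expected result)
test_plan = [
    ("normal value",           30, True),
    ("boundary: lower limit",  18, True),
    ("boundary: just outside", 17, False),
    ("boundary: upper limit",  65, True),
    ("boundary: just outside", 66, False),
    ("erroneous value",        -5, False),
]

for purpose, data, expected in test_plan:
    assert valid_age(data) == expected, purpose
```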
Fixing Problems
Problems found need to be fixed and the tests repeated;
the cycle is repeated until the system functions correctly
Automating Testing
Tools are available to automate various types of testing;
separate programs can be written to carry out complex
test series (see the sketch below)
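For example, Python's built-in unittest framework can re-run a whole test series automatically after each fix; valid_age here is the same hypothetical validator sketched above.

```python
import unittest

def valid_age(age: int) -> bool:
    """Accept ages 18-65 inclusive (hypothetical rule, as above)."""
    return 18 <= age <= 65

class AgeValidationTests(unittest.TestCase):
    def test_normal(self):
        self.assertTrue(valid_age(30))

    def test_boundaries(self):
        self.assertTrue(valid_age(18))
        self.assertFalse(valid_age(17))

    def test_erroneous(self):
        self.assertFalse(valid_age(-5))

if __name__ == "__main__":
    unittest.main()   # discovers and runs the whole test series
```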
Test Platforms
Edsger Dijkstra, the Dutch computer scientist, observed that
"program testing can be used to show the presence of bugs,
but never to show their absence"
Bugs will always be present in complex modern
software, so new updates are put on a test platform before 'going live'
(a version of the system used for tests without impacting the live system)
Evaluation
Define it
Functionality
- does it fulfil its purpose?
Effectiveness
- how well does it perform?
Usability
- is it intuitive? appropriate for its intended user?
Reliability
- how robust is the system?
Maintainability
- how easy is it to fix problems?
Extendibility
- how easy is it to add new functionality?