18-649 Project 7
Design Process Cleanup and Midterm Acceptance Testing
Please submit all project-related correspondence to the course staff.
- Fixed statement about leveling: You MUST be able to pass the acceptance test with leveling enabled.
- There is a simulator flag to disable it that you may use for debugging purposes only.
Throughout this semester your group has designed, implemented, and
tested a basic elevator system. Before you proceed
with optimization, you will make sure the entire design package is 1)
organized well and 2) consistent with your current
implementation. Your design will also need to pass all
previous tests and the additional tests provided for this
project. If you have been following the process in the prior
design stages, this
project should be pretty straightforward.
Note that this project is more heavily weighted than previous projects.
What you need to do:
1) Make sure you have the latest version of the simulation framework
from the download page.
A note about control periods:
The control periods for controllers instantiated during acceptance
tests are defined in MessageDictionary.java. For this project, the
following control periods are required. These are the default values in
the simulator, so if you have not changed them, you should be fine.
- HallButtonControl: 100ms
- CarButtonControl: 100ms
- LanternControl: 200ms
- CarPositionControl: 50ms
- Dispatcher: 50ms
- DoorControl: 10ms
- DriveControl: 10ms
We believe these control periods are reasonable. If you wish to change
the control period used by any controller, you MUST obtain the approval
of the course staff. In order to convince us, you will need to
make an argument based on the timing of the physical system. "It
makes my elevator work" is not a valid argument!
If you obtain TA approval, you must write up
the justification for the change and include it in the description of
the relevant control object(s) in the Elevator Control package (elevatorcontrol/package.html)
in your portfolio. You should also record the name of the TA
that approved the change. If you either a) do not obtain TA approval, or
b) do not note the changes and the justification for the changes in
your portfolio, significant
points will be deducted.
2) Organize your design portfolio.
Make sure that your portfolio documents
conform to the structure and guidelines described in the projects and
the portfolio page. If you have been following these
guidelines, you should have very little work to do here.
3) Complete your integration tests (including
traceability, summary files, etc.).
4) Peer review your integration tests.
You should peer review at least 4 of the newly created
integration tests for this project. Although you are only required to
review 4, we highly encourage you to do more to ensure testing coverage.
- If you have more than 20 sequence diagrams, you need only create
tests for 20 of them. However you MUST test the original
scenarios provided in the portfolio template (1A, 1B, 1C, 2A, 2B, 3A,
4A, 5A, 5B, 6, 7A, 7B, 7C, 8A, 9A)
- If you have fewer than 20 sequence diagrams, you must test all of them.
- You must pass all integration tests.
5) Ensure your design portfolio is
complete and consistent. If you have been keeping up with
updates in the previous project stages, most of this work will already
be done. The following is a partial list of the characteristics your
portfolio should exhibit:
- Problems identified in previous assignments have been corrected.
- All documents are complete and up to date with the latest
time-triggered implementation (i.e., code matches statecharts,
requirements, and sequence diagrams, and all traceability is complete and
up to date).
- All documents include the group # and member names at the top of the
document. (This includes code, where this information should
appear in the header field.)
- Individual documents have a uniform appearance (i.e., they don't look
like they were written by 4 individual people and then pieced together).
- Code is commented sufficiently and reflects code-to-statechart
traceability.
- The issue log is up to date and detailed enough to
track changes to the project.
6) Use Runtime Monitoring to Study Elevator Performance
For this part of the project, you will use the runtime monitoring
framework to study the performance of your elevator. Later on, in
Project 11, you will use the runtime monitoring framework to verify that your
final design has met the high level requirements.
First, read the description of the runtime monitoring framework in the Runtime Monitoring Overview.
Then use SamplePerformanceMonitor.java as a starting place to
create your own runtime monitor. Add a new class
called Proj7RuntimeMonitor. Be sure you use exactly this name,
because the monitor may be graded by an automated script that relies on
it. Make sure Proj7RuntimeMonitor extends the same runtime monitor
base class that SamplePerformanceMonitor does.
Your monitor shall record the following statistics;
make sure you implement the summarize() method to report these results
at the end of the test:
- How many times the elevator
became overweight - Specifically, count the number of stops
where the car became overweight
at least once. If the doors close completely and reopen at the same
floor, that counts as two overweight instances. Code to meet this
requirement is provided in SamplePerformanceMonitor.
- How many wasted openings -
a door opens but there is no call at
that floor. Depending on your design, you may need to use button
presses, light indicators, or changes in car weight to determine
whether or not an opening was wasted.
- How much time was spent dealing
with door reversals - count
the time from a reversal until the doors fully close, and accumulate
this value over the whole test. Hint: check out the door reversal handling
in SamplePerformanceMonitor. Note: If you are only using NUDGE for the doors, you
will not see any time spent dealing with door reversals. This is OK, but you must
implement this anyway, as it will be useful in later projects.
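As a concrete illustration, the bookkeeping behind these three statistics can be sketched as a small self-contained class. The class and callback names below (DoorCycleStats, onDoorOpened, and so on) are hypothetical, not part of the simulation framework; your real Proj7RuntimeMonitor must extend the framework's monitor base class and derive these events from its callbacks, but the counting logic is the same.

```java
// Hypothetical sketch of the bookkeeping a Proj7RuntimeMonitor needs.
// DoorCycleStats and its callback names are illustrative assumptions,
// not part of the simulation framework; the real monitor derives these
// updates from framework message/event callbacks.
public class DoorCycleStats {
    private final int maxWeight;        // overweight threshold, per your design
    private int overweightStops;        // stops where the car went overweight at least once
    private int wastedOpenings;         // openings with no call at the floor
    private long reversalTimeMs;        // accumulated reversal-to-fully-closed time
    private boolean doorsOpen;
    private boolean overweightThisOpening;
    private long reversalStartMs = -1;  // -1 means no reversal in progress

    public DoorCycleStats(int maxWeight) {
        this.maxWeight = maxWeight;
    }

    public void onDoorOpened(boolean callAtThisFloor) {
        doorsOpen = true;
        overweightThisOpening = false;
        if (!callAtThisFloor) {
            wastedOpenings++;           // nobody asked for this stop
        }
    }

    public void onWeightChanged(int weight) {
        if (doorsOpen && weight > maxWeight) {
            overweightThisOpening = true;
        }
    }

    public void onDoorReversal(long nowMs) {
        if (reversalStartMs < 0) {
            reversalStartMs = nowMs;    // clock runs until the doors fully close
        }
    }

    public void onDoorClosed(long nowMs) {
        if (reversalStartMs >= 0) {
            reversalTimeMs += nowMs - reversalStartMs;
            reversalStartMs = -1;
        }
        if (overweightThisOpening) {
            overweightStops++;          // going overweight again after a reopen counts separately
        }
        doorsOpen = false;
        overweightThisOpening = false;
    }

    public int getOverweightStops() { return overweightStops; }
    public int getWastedOpenings()  { return wastedOpenings; }
    public long getReversalTimeMs() { return reversalTimeMs; }

    // report the totals at the end of the test, as summarize() should
    public String summarize() {
        return "Overweight stops: " + overweightStops
             + ", wasted openings: " + wastedOpenings
             + ", door reversal time: " + reversalTimeMs + " ms";
    }
}
```

In a real monitor, the callAtThisFloor flag would come from the button presses, light indicators, or weight changes mentioned above, and the timestamps from the simulation clock rather than wall-clock time.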
You must also complete a peer review of your runtime monitor and add it to the
peer review log.
You are not required to improve your design based on these metrics,
only to measure the performance of your current design. However, they
should get you thinking about ways to improve your design and the
performance of your elevator.
If your monitor does
anything more than monitor the system (e.g., outputs any framework or network
messages, or affects the simulation state in any way), you will receive
significant deductions on the assignment.
7) Pass an acceptance test.
Read the Acceptance
Requirements. We have provided three acceptance tests. Running an
acceptance test involves:
- Run the acceptance test and observe the results of the test.
- Be sure to run your Proj7RuntimeMonitor during the test.
- Save the test input and output files in the acceptance_test/
folder of your portfolio
- Record the results of the test, including the monitoring results,
in the Acceptance Test Log.
- Add entries to the Issue Log
documenting any defects identified as a result of the test. Be
sure to include the issue log #'s in the Acceptance Test Log file as well.
To receive full credit for this assignment, you must run all three
tests with a random seed of 8675309 and document the test
results. You must pass proj7acceptance1.pass for
the purposes of the midsemester project handin. You
are not required to pass proj7acceptance2.pass or the third test, but
passing them earns bonus points (see below). For any of the
three tests, if your elevator does not pass the test, you must document
the problem in the Acceptance Test Log and the Issue Log.
Note 1: You may notice
passengers not boarding/leaving your elevator. This is possibly because
you haven't correctly implemented the leveling speeds for the drive controller.
To handle situations such as cable slip when the elevator is overweight, you will
need to level with the doors open.
See the requirements for the Passenger and the DriveControl speeds for more information.
Note 2: If you do not pass
proj7acceptance1.pass by the
time you hand in Project 7, you MUST (eventually) pass
this test in order to get a grade for this course. In that case,
contact the course staff to arrange a demonstration when you are ready.
Note 3: You are strongly
discouraged from using exception handling in this project. Starting in
Project 8, you will be forbidden to do so without course staff approval.
BONUS: Pass additional acceptance tests.
You can earn bonus points by also passing proj7acceptance2.pass
and the other bonus acceptance test, which simulate conditions with a
light-to-moderate passenger load. You must fully document the
test results and any associated bugs (as described above)
even if your elevator does not pass these tests. Note that you
will eventually be required to pass these tests in Project 8.
The bonus is substantial (1% of your total course grade), but the
requirements for getting the bonus are also substantial! In
order to be eligible for bonus points, you must:
- Successfully pass (and document) all unit, integration, and
acceptance tests required by the project.
- Have a complete and consistent design portfolio (see the grading
rubric for more details).
- Turn the project in on time.
If you do not meet all these criteria, you will not get bonus points
even if your elevator passes the bonus acceptance tests. This is
because we don't want you to ignore other parts of the project in order
to try and hack together an elevator that passes the additional
acceptance tests. We want to give you an incentive to submit a
complete design package. We also want to reward teams that have
put in a substantial effort on the project.
Handing In Results
Each team shall submit exactly one copy of the design portfolio.
Follow the handin instructions detailed in the
Project FAQ to submit your portfolio into the afs handin directory (/afs/ece/class/ece649/Public/handin/project7/group#/ontime/).
Be sure you follow the required format for the
directory structure of the portfolio and its location in the handin
directory. Be sure to follow ALL the
portfolio guidelines detailed in the Portfolio Layout page.
Any submission that contains files with modification dates
after the project deadline will be considered late and subject to a
grade deduction (see course
policy page for more information).
Grading (135 Points + 10 Bonus Points)
This assignment counts as one team grade. If you choose to divide
the work, remember that you will be graded on the whole assignment.
The grading rubric is given here. Grading will be as follows:
- 15 points - for a
Proj7RuntimeMonitor that records the stated performance measurements.
- 15 points - for a design
portfolio that meets the portfolio layout requirements
- 20 points - peer reviews
for at least 4 of the integration tests and the runtime monitor
- 35 points - for complete
unit and integration testing, and for running and documenting the three
acceptance tests.
- 45 points - for a
complete and consistent portfolio. Note that the grading criteria
are based on sampling. If we check one part of your portfolio and
find problems, chances are that there are problems in other parts as well.
- 5 points -
for an improvements log entry. If you
encountered any minor bugs that we haven't already addressed, please
mention them so we can fix them. If you have no suggestions, say so in
your entry for this project.
- 10 points - for passing the 2
bonus acceptance tests (points only awarded if the rest of the project has
been substantially completed).