In this project, you will implement a secure log to describe the state of an art gallery: the guests and employees who have entered and left, and which persons are in which rooms. The log will be used by two programs. One program, logappend, will append new information to this file, and the other, logread, will read from the file and display the state of the art gallery according to a given query over the log. Both programs will use an authentication token, supplied as a command-line argument, to authenticate each other. Specifications for these two programs and the security model are described in more detail below.
You will build the most secure implementation you can; then you will have the opportunity to attack other teams’ implementations.
You will work in teams of 2-3 people in groups assigned by us. Your first task is to choose a team name and email it to our TA, Aymeric Fromherz. Make sure you include both your team name and the names of all team members in your writeups.
All implementations must be written in C, and we will provide you with some basic starting code and a makefile. Feel free to make whatever changes you feel are necessary, as long as you stay within the rules outlined below. You are not obliged to keep the starter code as we gave it to you; it is just there as one suggested way of organizing your code.
This project will operate using infrastructure from the Build it, Break it, Fix it contest developed at UMD. Please note that your grade for the project and your score in the contest are not the same (although they are likely to be correlated). Details of grading and scoring are below. Scoring well in the contest is good for bragging rights and for extra credit.
Each student must register at our contest website (which requires an on-campus IP address) and then affiliate with your team. Choose a password for the contest website that is not the same as the one you use for your regular CMU login. The team leader can then create the team, invite the other team members, and register the team for the contest.
Your team will design a log format and implement both logappend and logread to use it. Each program's description is linked below.
The logappend program appends data about activity in the museum to a log.
The logread program reads and queries data from the log.
logread contains a number of features that are optional. If you do not implement an optional feature, be sure to print unimplemented to standard output.
Look at the page of examples for examples of using the logappend and logread tools together.
The system as a whole must guarantee the privacy and integrity of the log in the presence of an adversary that does not know the authentication token. This token is used by both the logappend and logread tools, specified on the command line. Without knowledge of the token an attacker should not be able to:
- Query the log using logread, or otherwise learn facts about the names of guests, employees, room numbers, or times by inspecting the log itself.
- Fool logappend into accepting a bogus file. In particular, modifications made to the log by means other than correct use of logappend should be detected by (subsequent calls to) logappend when the correct token is supplied.
An oracle reference implementation is provided to demonstrate the expected output of a series of commands run on logappend and logread. Contestants may run the reference implementation by going to the team participation page on the website and clicking on "Oracle Submissions". Here is an example of the expected input format for the oracle, structured as a series of command-line calls within brackets:
[ "logappend -T 1 -K secret -A -E Fred log1", "logappend -T 2 -K secret -A -G Jill log1", "logappend -T 3 -K secret -A -E Fred -R 1 log1", "logappend -T 4 -K secret -A -G Jill -R 1 log1", "logread -K secret -S log1" ]
Each team should initialize a git repository on the ECE GitLab and share it with the 732s18-grading user. You must grant the 732s18-grading user at least reporter permissions. You MUST NOT make your repository public; doing so will be treated as an academic integrity violation.
Create a directory named build in the top-level directory of your repository and commit your code into that folder. Your submission will be scored after every push to the repository. You may keep your code in other directories to keep it from being scored.
To score a submission, an automated system will invoke make in the build directory of your submission. The only requirements on make are that it must function without internet connectivity, it must return within ten minutes, and it must build from source (committing binaries only is not acceptable). We will provide a sample makefile.
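A minimal makefile meeting these constraints (offline, fast, builds from source) might look like the following. The source file names here are placeholders — use whatever layout your team chooses:

```makefile
# Sketch only: source/target names are placeholders.
CC      = gcc
CFLAGS  = -std=c11 -Wall -Wextra -O2

all: logappend logread

logappend: logappend.c
	$(CC) $(CFLAGS) -o $@ $<

logread: logread.c
	$(CC) $(CFLAGS) -o $@ $<

clean:
	rm -f logappend logread
```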
logappend and logread should be executable files within the build directory. An automated system will invoke them with a variety of options and measure their responses. The executables must be able to be run from any working directory.
To ensure you’re making progress on the assignment, we will have a checkpoint deadline.
By this deadline, your team must achieve at least 10 points on the contest scoreboard. You can then continue to refine your system until it’s time for the final submission described below.
Via Canvas, you should submit:
We will assign you implementations developed by four other teams. Your goal will be to find and validate bugs and vulnerabilities in that code according to the Break-It specifications.
Via Canvas, you should submit:
For points in the contest, you are welcome to also look at implementations beyond those you have been assigned, but you can submit at most 5 breaks per implementation, and your breaks can target at most 8 implementations.
If one of your assigned implementations doesn’t work well enough to analyze, please request an alternate implementation from an instructor.
You will be given access to all breaks against your initial implementation, and you are now responsible for fixing them. For each bug, submit a small patch that fixes it. Each patch should only fix one bug (though fixing that one bug may fix many break reports).
You should also submit a fix-it report that organizes your fixes into categories (e.g., these three fixes address buffer overflows). Explain how you developed the fixes and your level of confidence that they address all such bugs in your program.
Part A: Build It will be worth 100 points
Part B: Break It will be worth 100 points
Part C: Fix It will be worth 50 points
The assignment as a whole is worth 250 points. The top 50% of teams, based on contest score, will be awarded an additional 10 points (4%). The top 10% of teams will be awarded an additional 15 points for a total of 25 (10%). The very top team will receive adulation and eternal bragging rights.
1.01: Clarified points received at checkpoint.