Cosmic Ray e-Lab Fellows, Center Group
Submitted by rspete
on Monday, October 8, 2012 - 10:46
Workbench space for e-Lab Fellows to create, edit, and store workshop resources and documents.
Description
library for workshop documents
Cosmic Ray Telescope TRACKING plan 20-May-2017
Cosmic Ray Workshop Agenda
“Our mission is to create opportunities for teachers and students to explore the nature of scientific research. Using cosmic ray muon detectors, participants design and perform experiments to analyze data employing the cosmic ray e-lab.”
Learning objectives.
a. Configure a cosmic ray detector appropriately for acquisition of data for calibration and analysis of measurements.
b. Identify and describe the e-Lab tools available for conducting studies with data collected using a cosmic ray detector.
c. Create, organize and interpret a data plot to make a claim based on evidence; provide reasoning and identify data limitations.
d. Develop a plan for taking students from their current level of data use to subsequent levels using activities and/or ideas from the workshop.
Pre-Workshop notes:
-- needs assessment including communicating with mentor and/or lead teacher
-- check GPS location
-- check room
-- check projector and cables for presentations
-- print workshop evaluation
-- gather tools & supplies - TAPE! TAPE! TAPE!
-- optional: black light --> scintillator
-- create e-Lab and QuarkNet accounts for all teachers
-- have data collection computers with EQUIP installed and Java updated
1st Day:
Welcome/Intros and mission statement and enduring understanding
Sign-in:
Teachers must sign in each day. The sign-in sheet should, minimally, have spaces for the name and signature of each teacher as well as time in and time out. Send the completed sign-in sheets to Anne Zakas. Originals are preferred; mail to:
Anne Zakas
QuarkNet Center at Notre Dame
Department of Physics
225 Nieuwland Science Hall
Notre Dame IN 46556
Needs assessment, norms, goals and learning objectives
Logistics: schedule, meals, parking lot for questions/comments, etc.
Cosmic Ray science lecture <-- CR scientist
QuarkNet overview: "The Purpose of Quarknet" <-- QN staff
Cosmic Ray Mission discussion
Cosmic Ray Muon Detector hardware components
Assemble CRMD: assist by QN staff
Four teams --> assemble four counters
One team --> plan DAQ/GPS placement
Plan study type: shower array or stacked array?
Measure GEOMETRY
Data-taking: 'EQUIP' procedure
Reflection and Discussion of the day’s activities
Take CR data: overnight
2nd day:
Review of previous day’s activities
Cosmic Ray e-Lab overview
e-Lab login: verify accounts
Manage accounts: create student groups, change password
Tour e-Lab: Teacher side; Student side
Teacher: Learning Objectives, Community
Student: Cool Science, Library, Upload, Data, Posters
CRMD data UPLOAD: overnight data
PERFORMANCE analysis tool
-- Assess data quality and discuss need for plateauing.
Load GEOMETRY
FLUX, SHOWER, Lifetime, and Time of Flight analysis tools
Team CR investigation: FLUX, SHOWER --> write shared POSTER
If 2-day workshop only:
POSTER presentations
Implementation Plan: classroom approach
Future plans for group: CRMD data-taking, coordinate research, CRMD rotation schedule
Wrap up
3rd day:
Review of previous day’s activities
Work time: Teams
Implementation Plan: classroom approach
POSTER presentations
Future plans for group: CRMD data-taking, coordinate research, CRMD rotation schedule.
Surveys:
- Teacher Implementation Survey - https://www.surveymonkey.com/r/NTWHZF5 - http://tiny.cc/qnis17
- to be completed by ALL teachers of all summer programs, workshops, institutes etc. at any center
- complete before end of summer workshop
- takes 20-30 min
- Participant Satisfaction Survey - https://www.surveymonkey.com/r/NV726DM - http://tiny.cc/qnps17
- to be completed ONLY by participants at QuarkNet-provided workshops facilitated by QuarkNet staff or fellows
- complete at end of workshop
- takes about 10 minutes
Contacts:
<facilitator>
<mentor>
Improvements to CR e-Lab
We propose a meeting at FNAL to discuss the e-Labs (31 March - 2 April). While most of our concerns are about Cosmic, we may also discuss other e-Labs for a limited time.
Our experience with phone- and video-conferences is limited to one- or two-hour events. Even at that length, it is sometimes hard to maintain connections and focus with all attendees. This meeting will take place over three days; we feel that duration is difficult with participants in different places.
We suggest the following attendees: Bob, Liz, Edit, Tom. Ken might also wish to come for the Cosmic discussion. We can set up a two-hour conference call with Tom McCauley and Dale Ingram near the end of our time.
Discussing these issues will help us to decide how best to spend our limited resources. We have been squashing bugs and putting out fires for too long. It's time to look at the entire system critically and decide what we can afford to address and what we have to leave as it is.
JUST IN: We might need to take into account that we will be working with Drupal....
Abbreviated List (March 26)
Who is watching the servers to be sure that everything is up? Whom can they call if something goes astray?
Daily server observation
What does Edit do in the background to keep an eye on the servers? Is Nagios enough?
- Concurrency testing
- Load balancing (priority queue)
- Data uploads
- Remove bottlenecks in searching
- Bugs
- Feature requests
Original Long List
Topics for discussion include:
Performance of the Cosmic e-Lab
The e-Lab is showing its age. It falls over too often in workshop or classroom settings, and we need to explore ways to address this. We have just rolled out a big change in analysis workflows that should help with performance issues. We plan to test this recent update, but we also need to discuss other areas where we can improve performance. These include:
- re-plotting
- We currently re-run an analysis if the user wants to do something as simple as re-scaling the axes. This is a waste of compute cycles. The bless plots allow redoing axes just by clicking on the plot; we should add that feature to plots from the analyses as well.
- EP: This could take some time because we need to change the graphics tool from svg to flot. That involves changing the perl code, possibly the swift scripts and new coding for flot, json and gson.
- Java vs. Perl
- All of the analysis workflows rely on aged Perl scripts. We moved ThresholdTimes to Java and realized a big performance boost. Is it worthwhile to consider re-coding the rest of the workflows in Java?
- EP: That could be done but it will take some time to convert from the Perl to the Java plus testing to make sure that all works well (including rounding). Also we would need to modify swift scripts to call java rather than perl (it might be easy, not sure).
- Concurrency
- The e-Lab gets very unresponsive when there are multiple users. This problem may have improved with the persistence of the .thresh files. We should still consider testing before the summer workshops start.
- We might try running new stress tests with multiple users and see if this is still the problem. The last couple of reports that I received about the server "hanging" were due to other reasons (still investigating that but not due to multiple users/analysis running).
- Load-balancing of jobs
- Is it possible to make a queue that will manage jobs in order to reduce pile-up?
- EP: I am working on this and we will soon have a queue for testing :)
- Problems with scalability: too many data files, plus a more complicated search with blessing, are bogging down searches. (Do we need to keep all of these files? Can we simplify what we search for?)
- We need to discuss this in a telecon and come up with ideas by looking at the problem areas (cosmic search data).
- review our infrastructure (vdc, physical files, etc).
- We probably need to involve other team members to discuss this issue (Mihael, Mike W.).
- Workshop use is limited because response time is so long when 15 students make simultaneous requests. Improvements in multiple job submission or a list of conditions that allow 10 simultaneous jobs would be helpful.
- We need to test this again.
- Data uploads
- Is there any way to improve the speed of data uploads? They take so long that something must be wrong on the server side. Tom sent mail about this on 19 Feb.
- We need to find tools to test what causes the delay. Uploading from the user machine to the server can take some time depending on the size of the file, the internet connection (I guess), browsers (?), etc.
- Shower interface
- When doing shower analyses users should be able to look for nearby detectors. The current interface gives them no information about separation. Users could select a detector at Fermilab and one in Japan. We've talked about using a map and a circle, but it doesn't have to be that fancy.
- This problem could be solved now without coding for a tool.
- Can we use mashups to make a 2014-style UI? Could this improve how we display logbooks, milestones and references?
- Needs to be well thought out. Not difficult to implement, but time-consuming because you have to work with the whole framework.
- A better interface for the registration and management of users
- This functionality needs a major rewrite. Not difficult, but time-consuming.
Data mismatch
- Location mismatch
- if a teacher has an account at a school in Wisconsin and uploads data, the data looks like it is in Wisconsin. If that teacher takes the detector somewhere else (e.g., the South Pole) and uploads data, the data look like they are in Wisconsin.
- Add documentation for teachers to get new accounts if they move schools or want to take data far away from their school
- Getting a new account is how we currently do this. It's a workaround and puts the burden on the user. The difficulty is that the metadata for the location of the uploaded data comes from the teacher's login. The metadata could instead come from the geometry of the detector. This is a much bigger solution and harder to code--maybe it's "pie in the sky," but it solves, rather than works around, the problem. Doing location this way would also make it easier to change the shower interface.
Keeping up with new versions of system level software
We need to keep on the web treadmill as our software stack ages.
- Java 6 vs 7 / Tomcat 6 vs 7
- Already working with this.
- We need to upgrade. This includes setting up the environment and making sure our existing code migrates cleanly.
Unit Testing
- The webapp lacked unit testing: code used to verify that each part of a program works correctly. These tests are written at the same time as the code to prevent buggy code, and later to make sure the code still works with updates, new additions, etc. We have started adding these, but it is still a work in progress.
- Also, we need to add some automated testing: deployment of the application has presented problems a few times. It seems there is a thread race at some point, and the compilation of some classes does not complete. A new deployment usually fixes this, but we need automated tools to tell us whether the rollout is 100% good instead of manually testing the website.
Better Administrative Tools
- More monitoring tools, etc.
- Replicate command line ways of retrieving information from the server
- Make sure that the statistics we are gathering are correct (e.g., number of pretests).
- A tool like NGOP at Fermilab that checks whether the site is up. It emails or calls a telephone if the site is down. Argonne must have this.
- What can we use to store all the information about the system?
- University of Chicago Developer's wiki vs Drupal.
- Do we need more documentation on the analyses and how they work?
- Should we update our screencasts or add more? (e.g., screencast of data blessing on upload)
- Update the references associated with milestones.
- Inspect the Tomcat log for feedback (analyses, cms event display, interactions with the database).
- Look at trends in the number of logbook entries, pre and post-tests entries and other items in the e-Lab over time.
- Use a tool like Google Analytics to see what the users are doing (logbook, milestones, references, etc.).
- Generate a survey to learn what users (and non-users) are doing.
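A minimal sketch of an NGOP-style "is the site up?" check like the one proposed above, in Python; the URL below is a placeholder, not a real e-Lab address, and the alerting hook is left as a stub:

```python
import urllib.request
import urllib.error

def is_up(status_code):
    """Treat any 2xx/3xx HTTP response as 'site up'."""
    return 200 <= status_code < 400

def check_site(url, timeout=10):
    """Return True if the page responds with a healthy status, else False."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return is_up(resp.status)
    except (urllib.error.URLError, OSError):
        # DNS failure, refused connection, timeout, etc.
        return False

if __name__ == "__main__":
    # Placeholder URL -- substitute the real e-Lab front page.
    url = "https://www.example.org/elab/cosmic/"
    if not check_site(url):
        # Hook up email or paging here, as NGOP does.
        print("ALERT: %s is not responding" % url)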
Marketing our e-Labs
- How can we retain and increase our audience?
- Make NextGen more prominent.
Examination of the original assumptions and goals of the project, how we have succeeded, and whether they are still relevant.
- (e.g., sharing data especially for shower studies, providing analysis tools to schools)
- Is a shower study using multi-school data practical with our tools? Do we need more functionality and visualization?
- Do we need the tight coupling between milestones and the logbook? How does the Purdue Java tool influence what should be in the e-Lab?
What new technologies might replace what we have implemented
- Should we be replacing logbooks with some open-source solution?
- Should we be using Cloud Computing?
- Should we still be using Tomcat and Java Server Pages?
Getting rid of wiki dependency
Handling glossary, etc. consistency in all e-Labs
Should we continue to try to support and develop the CMS and LIGO e-Lab?
Yes. Next question.
Hardware Needs
Feature Requests
- Allow the user to superimpose multiple data sets on one plot, e.g., barometric pressure vs. flux.
- Histograms of times between counters "i" and "j" (6 possible 2-fold combinations). Applications in order of usefulness: speed of muon; performance studies; calibration of relative counter timing for use in shower reconstruction and for estimates of in-time counter hits versus random backgrounds.
- Counter multiplicity and logic requirements for any analysis. Install the ability to require a specific set of counters, e.g., counters 1 and 3 but not counter 4, within a user-specified time window. This is critical in lifetime studies. If relative times from suggestion 1 were available per event, that would also enhance muon lifetime measurements.
QN Cloud Chamber
Cosmic Workshop Resources
- Generic e-Lab PPT presentation script for Summer 2012 Cosmic Ray e-Lab workshops
- Punch List for generic CRMD & Cosmic e-Lab workshop, May 2012. This describes all the pre-steps to get ready for an e-Lab workshop. This should be sent to the Mentor or Lead Teacher well in advance so they can gather resources.
- QuarkNet e-Lab Fellows POST-WORKSHOP REPORTING FORM, May 2012
- QuarkNet e-Lab workshop PARTICIPANT EVALUATION FORM, May 2012
- QuarkNet Cosmic Ray Muon Detector Workshop Plan for 2 day Workshop
- QuarkNet Cosmic Ray Muon Detector Workshop Plan for 1.5 and 2 day Workshop
- QuarkNet Cosmic Ray Muon Detector Workshop Plan for a 5 day Workshop
- Example of a short e-Lab tour NOT MEANT TO REPLACE FULL e-Lab WORKSHOP!
e-Lab Fellows
Nathan Unterman: nunterman@glenbrook225.org; nunterman@gmail.com
Jeff Rodriguez: jeffrey.rodriguez@foresthills.edu
Martin Shaffer: SHAFFERM@cowley.edu
Kevin Martz: kevinmartz2009@gmail.com
Rose Emanuel: astroemanuel@gmail.com
Robert Franckowiak: rgntna9@yahoo.com
Elisa Gatz: egatz@sps5.org
Mark Adams: adams@fnal.gov
CRMD User Agreement, Assembly Instructions, Users Manual
1. CRMD USER AGREEMENT
Current agreement necessary for receiving a new "6000" CRMD following the next production cycle: CRMD User Agreement, V2.0.
2. CRMD ASSEMBLY INSTRUCTIONS
- Series "6000" CRMD Assembly Instructions, Version 2.13: 6000 CRMD Assembly Instructions
- Older version of the Series "5000" CRMD Assembly Instructions, Version 1.3: 5000 CRMD Assembly Instructions
3. CRMD USERS MANUAL
- Series "6000" CRMD User's Manual, January, 2010, Version 1.1
- Series "5000", "200" CRMD: This is the older 2004 version of the CRMD Users Manual.
- Drafts and outlines of a proposed, online CRMD Users Manual. Go here. Needs Fellows input.
e-Lab Fellows Working Documents
1. PLATEAUING DETECTORS
"6000" CRMD Counters, Current version: Nov 2011
Remade from previous versions; tested by Jeremy Paschke and Martin Shaffer, Summer 2009
Students will require both of the following files to plateau.
6000 HOWTO Plateauing Powerpoint
6000 Plateauing Spreadsheet Form
Older, out-of-date versions:
- From Jeremy Paschke, Summer 2008
These two documents include edits and remakes of the Fellows documents listed above; they fix fundamental errors. Thanks to Jeremy. Fellows, please review and add edits. When all changes have been received, these will be moved into the Cosmic e-Lab proper for public use.
HOWTO Plateauing Powerpoint: How-to-Plateau-2008JP
Plateauing Spreadsheet Form: Plateau-Template-2008JP
- From Fellows, Summer 2007
We've got a nice PowerPoint on connecting all the equipment and plateauing the counters. The level is intended for beginning students or teachers, and it is designed to march through the procedure as time-efficiently as possible so that anyone can start to do real science as soon as possible.
HOWTO Plateauing Powerpoint: How+to+Plateau
There's an Excel spreadsheet template associated with these directions. The PowerPoint has a link to the spreadsheet, but you may have to open it yourself if you download from here.
Plateauing Spreadsheet Form: Plateau+Template
- For completeness; plateauing instructions contributed from others; some out of date
Notre Dame: Counter_efficiency_ND.doc
NorthVA/Hampton: VOLTAGE_OPTIMIZATION_VA.doc ; HU_Plateauing_Spreadsheet_VA.xls
Mark Adams, UIC: MAdamsRevised_plateau.doc
"5000" series Cookbook; out of date: Plateau_cookbook.doc
2. RUNNING A WORKSHOP
This file was made during the leadership session at Fermilab on Sunday, 7/15. Please improve and delete this disclaimer.
This file is the one created by Cheryl and her group during the workshop.
3. PLANNING AN EXPERIMENT
This is an Excel spreadsheet template for calculating how long you might need to run an experiment to see statistically significant differences in count rates.
You enter your expected baseline count rate and your expected difference between counts under different conditions. The spreadsheet calculates total counts during increasing numbers of seconds, along with the uncertainty in that total count. As you'll see, the smaller the expected difference, the more counts you need to collect, and the longer the detector needs to run.
We used this for an experiment in a 15-story steel-and-concrete building. We used two counters set to count when a 2-way coincidence occurred, separated by about 60 cm vertically. We planned to take counts on the 15th floor, the 8th floor, and in the basement. We expected, based on just a hunch, that the building would have very little effect on counts. We decided to take enough counts to demonstrate a real difference at the 1% level. The spreadsheet calculated about an hour of counting time to collect enough counts to prove a difference at the 1% level.
By the way, the actual difference was about a 2/3 reduction in counts as the detector moved from top to bottom of the building.
Here's the Excel spreadsheet for planning an experiment: Planning an Experiment
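The counting-statistics idea behind the spreadsheet can be sketched in a few lines of Python. Assuming Poisson statistics, two runs of equal length T at baseline rate r collect about rT counts each; a fractional difference f shows up at z standard deviations when f*rT >= z*sqrt(2rT), i.e., T >= 2z^2/(f^2 r). This is a sketch of the reasoning, not a reimplementation of the spreadsheet itself:

```python
def required_run_time(rate_hz, fractional_diff, z=2.58):
    """Seconds of counting needed, per run, so that a fractional rate
    difference `fractional_diff` between two equal-length runs at
    baseline rate `rate_hz` stands out at z standard deviations
    (z = 2.58 corresponds roughly to the 1% level).

    Each run collects N = r*T counts; the expected difference is f*r*T
    with Poisson uncertainty sqrt(2*r*T), so requiring a significance
    of z gives T >= 2*z**2 / (f**2 * r).
    """
    return 2.0 * z**2 / (fractional_diff**2 * rate_hz)
```

For example, a 10 Hz baseline with an expected 5% difference calls for about 2*(2.58)^2/(0.05^2 * 10), roughly 530 s per run; halving the expected difference quadruples the time, matching the spreadsheet's behavior.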
4. BLESSING PROPOSAL DISCUSSION
Proposal for Blessing data, July 2007.
Article from Auger group on Blessing data.
5. ESTIMATING FLUX RATES
Muon flux rates are strongly affected by:
- Scintillator surface area
- Orientation
- Spacing and solid-angle acceptance
- Intervening absorber material and thickness
- Altitude
A simple model is presented to allow estimating flux rates due to area, spacing, and absorber.
If you would like to know more about how this works, you may want to read the following notes.
This model uses the following rules of thumb:
- The sea-level muon spectrum is flat with a slope of dN/dE = 0.004/(GeV cm2 s sr).
- Energy absorption is proportional to "range" X = density x thickness. This has units of g/cm2; since water has a density of 1 g/cm3, one can use the unit "cm W.E.," or centimeters of water equivalent.
- Most materials absorb muon energy at a rate of dE/dX = 2 MeV / cm W.E.
- Incoming muon flux drops off as the cosine squared of the angle from zenith.
The muon flux accepted by the detector depends on the spacing between the counters together with the cosine-squared drop-off; vertical orientation is assumed.
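The cosine-squared acceptance of a stacked pair can be estimated with a small Monte Carlo. This is an illustrative sketch, not part of the model notes: square counters, a uniform entry point, and the zenith-sampling recipe (for a cos^2 intensity, the flux through a horizontal counter has zenith pdf proportional to cos^3 t sin t, so cos t = U^(1/4) for uniform U) are all assumptions of the sketch.

```python
import math
import random

def coincidence_acceptance(side_cm, gap_cm, n=100_000, seed=42):
    """Estimate the fraction of muons crossing the top counter that also
    cross the bottom one, for two parallel square counters of side
    side_cm separated vertically by gap_cm, with intensity ~ cos^2(zenith).
    """
    rng = random.Random(seed)  # fixed seed for reproducibility
    hits = 0
    for _ in range(n):
        # Uniform entry point on the top counter.
        x = rng.uniform(0.0, side_cm)
        y = rng.uniform(0.0, side_cm)
        # Zenith angle: detected-flux pdf ~ cos^3(t) sin(t),
        # so cos(t) = U**(1/4) with U uniform in (0, 1].
        cos_t = (1.0 - rng.random()) ** 0.25
        tan_t = math.sqrt(1.0 - cos_t * cos_t) / cos_t
        phi = rng.uniform(0.0, 2.0 * math.pi)
        # Where the track crosses the bottom counter's plane.
        bx = x + gap_cm * tan_t * math.cos(phi)
        by = y + gap_cm * tan_t * math.sin(phi)
        if 0.0 <= bx <= side_cm and 0.0 <= by <= side_cm:
            hits += 1
    return hits / n
```

With zero gap every track crosses both counters, and the accepted fraction falls steadily as the spacing grows, which is the qualitative behavior the model notes describe.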
6. TIMING FROM RAW DATA
A brief attempt at explaining a time extraction from raw data.
QDAQ.exe
A Windows-based program has been developed that allows one to review a raw text data file and translate it into a form that can be analyzed using Excel.
Features
1. Browse the input data by record or event; translate each field and display its meaning.
2. Compile statistics over the entire file.
3. Output an Events file for analysis using Excel: one record for every event in the data file.
4. Output a Lifetimes file for analysis using Excel: one record for every pair of channel triggers within an event that occur within a user-specified range of time.
A copy of this program may be downloaded.
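The Lifetimes output described above (one record for every pair of channel triggers within an event that fall inside a time window) can be sketched in Python. This is a minimal illustration, not QDAQ itself, and it assumes the raw hits have already been decoded into (channel, time-in-ns) pairs:

```python
from itertools import combinations

def trigger_pairs(event_hits, max_dt_ns):
    """Return one (channel_a, channel_b, dt_ns) record for every pair of
    channel triggers in a single event separated by at most max_dt_ns.

    event_hits: list of (channel, time_ns) tuples, one per trigger.
    """
    records = []
    # Sort by time so each pair's dt is non-negative.
    ordered = sorted(event_hits, key=lambda hit: hit[1])
    for (ch_a, t_a), (ch_b, t_b) in combinations(ordered, 2):
        dt = t_b - t_a
        if dt <= max_dt_ns:
            records.append((ch_a, ch_b, dt))
    return records
```

For a muon-lifetime study the window would be set to a few microseconds, so that a stopping-muon trigger and its delayed decay-electron trigger form one record while unrelated triggers are dropped.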