Black Hills State University/SURF QuarkNet Center
Submitted by Anonymous (not verified)
on Monday, June 3, 2013 - 10:00
Teachers, students and physicists working together to explore high energy physics.
Description
A collaboration of teachers, students and physicists involved in inquiry-based, particle physics explorations.
Implementation of Workshop Ideas 2016
We are going to do the following. How can we implement this in our classroom?
- Use the P waves to give students a better idea about the structure of the interior of the Earth.
- Reference reflection, refraction, diffusion, the different types of waves, wave mechanics, etc.
- Discuss the change of wave speed in various media.
- Use LIGO to discuss gravitational waves and extend that discussion to wave mechanics.
- Light and its behavior.
- Spacetime.
- LIGO data can also be used for independent research projects to get students involved.
- I made this long ago: LIGO Activity
- Can be used to introduce and practice basic scientific-method research.
- Sometimes I like to throw deep ideas at the kids and make their heads hurt.
- Use the given resources to describe how gravity is described as a wave.
- Introduce what an interferometer is and how it is used at LIGO; interferometers are also used on spacecraft in the solar system.
- Use lasers to demonstrate reflection and deflection.
- The LIGO and Cosmic e-Labs lend themselves to studies the students come up with themselves and could easily be used for extended studies.
- Use in the discussion of orbiting binary stars, neutron stars, supernovae...
Seismic Waves Questions 1, 4, 5
By Deirdre Peck & Steve Gabriel
-
What are the types of seismic waves? How are they different from one another?
http://walrus.wr.usgs.gov/infobank/programs/html/school/moviepage/03.01.19.html
Geologists divide the seismic waves that travel through the Earth's interior into two basic types:
"primary" or "P-waves" and
"secondary" or "S-waves."
A P-wave is a compressional wave, also known as a longitudinal wave, that makes the rock vibrate parallel to the direction of its movement.
Since it is a very fast wave, traveling through rock at between four and seven kilometers per second, the P-wave is the first wave to arrive at a recording station following an earthquake.
An S-wave, on the other hand, is a transverse wave: its shearing motion makes the rock vibrate perpendicular to its path.
This movement slows the S-wave, so that it travels at two to five kilometers per second, or about half the speed of the P-wave.
This is why S-waves arrive as secondary waves at the Earth's surface.
There is another important difference between P-waves and S-waves.
Although both can pass through solid rock, only P-waves can also pass through gases and liquids.
Wave animations found at:
http://www.acs.psu.edu/drussell/Demos/waves/wavemotion.html
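The speed gap between P-waves and S-waves is what lets a single station estimate its distance from a quake: the farther away the source, the longer the S-wave lags behind the P-wave. A quick sketch in R, using speeds picked (purely for illustration) from the ranges quoted above:

```r
# Assumed speeds from the ranges above: P at 6 km/s, S at 3.5 km/s.
vp <- 6.0    # km/s
vs <- 3.5    # km/s
d  <- 100    # hypothetical distance to the quake, in km

tp  <- d / vp    # P-wave travel time
ts  <- d / vs    # S-wave travel time
lag <- ts - tp   # S-minus-P arrival delay seen at the station

# Inverting the relation recovers the distance from an observed lag.
d_est <- lag / (1 / vs - 1 / vp)
```

With these numbers a 100 km distance produces a lag of roughly 12 seconds, and the inversion recovers the same 100 km.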
-
What are sources of seismic waves? From Wikipedia
A seismic source is a device that generates controlled seismic energy used to perform both reflection and refraction seismic surveys. A seismic source can be simple, such as dynamite, or it can use more sophisticated technology, such as a specialized air gun. Seismic sources can provide single pulses or continuous sweeps of energy, generating seismic waves, which travel through a medium such as water or layers of rocks. Some of the waves then reflect and refract and are recorded by receivers, such as geophones or hydrophones.[1]
Seismic sources may be used to investigate shallow subsoil structure, for engineering site characterization, or to study deeper structures, either in the search for petroleum and mineral deposits, or to map subsurface faults or for other scientific investigations. The returning signals from the sources are detected by seismic sensors (geophones or hydrophones) in known locations relative to the position of the source. The recorded signals are then subjected to specialist processing and interpretation to yield comprehensible information about the subsurface.[2]
-
How are seismic waves measured?
Seismic waves are basically measured by keeping part of an instrument still: the inertia of a suspended mass holds it in place while the ground moves beneath it, and the relative motion is what gets recorded.
http://earthquake.usgs.gov/learn/topics/seismology/keeping_track.php
P waves arriving from below produce mostly vertical motion at the surface of the earth.
S waves generate mostly horizontal motion at the surface.
Seismic Waves Questions 1, 2, 3
Describe seismic waves.
By
Zach Beam, Amanda Towry, Jim Stith
What are the types of seismic waves? How are they different from one another?
P waves are compression waves (longitudinal waves). These waves are faster than S waves.
S waves are transverse waves (the motion is perpendicular to the direction of travel). They arrive at the seismometer after the primary waves.
Surface waves:
Love waves move back and forth (side to side)
Rayleigh waves move in an elliptical path
Further resources:
http://eqseis.geosc.psu.edu/~cammon/HTML/Classes/IntroQuakes/Notes/waves_and_interior.html
How do the various types of seismic waves propagate through the earth?
"In style" (Towry, 2016)
Refraction: based on density, which changes as the waves pass through different materials. The changing angle of incidence and angle of refraction causes P and S waves to curve. There is a shadow zone where body waves are not detected due to refraction.
Reflection: the wave doesn't just refract, it also reflects at the junctions between rock layers. Then you have two waves propagating through the Earth.
Dispersion: only applicable to the surface waves. Different periods travel at different velocities, so the effects of dispersion become more apparent over distance. Cliff notes: the wave train spreads out the farther it travels.
Further resources:
http://eqseis.geosc.psu.edu/~cammon/HTML/Classes/IntroQuakes/Notes/waves_and_interior.html
Why are seismic waves important to LIGO?
Because LIGO measures the "squishy and expansiony thingy" (Beam, 2016) of gravitational waves, that is, tiny stretches and squeezes of spacetime. To eliminate the background noise of seismic activity, LIGO must know what is going on in the ground so it can rule it out.
LIGO eLAB workshop at BHSU
Objectives
Participating teachers will be able to use the LIGO e-Lab to:
- Plot and interpret data recorded by LIGO seismic instruments
- Explain the importance of LIGO seismic data in gravitational wave search
- Identify and list classical physics concepts in LIGO data analysis
- Develop a plan for use of the LIGO e-Lab in the classroom.
Agenda
Times and specific activities are subject to adjustment.
Monday, July 25
08:00 Coffee, Registration
08:30 Introduction
09:00 Gravitational Waves presentation
10:00 Break
10:15 Interferometer activity
11:30 Look back/Look forward
12:00 Lunch
13:00 Exploration of LIGO e-Lab
14:30 Break
14:45 Search for and analyze data
16:30 End of Day
Tuesday, July 26
08:00 Coffee/Reflection
09:00 LIGO Hanford Virtual Visit
09:45 Break
10:00 Begin research
12:00 Lunch
13:00 Finish research/Create Posters
14:00 Break
14:15 Poster Presentation
14:45 Implementation discussion
15:15 Reflection
15:30 Evaluation
16:00 End of workshop
Resources
Contacts
Inspiring Science Educators Summer Academy 2015 in Marathon, Greece
It is an amazing experience here in Greece! This week I am with teachers from the United States and across Europe to develop lesson plans around inquiry science. A repository of these types of scenarios, called Go Labs, is found at http://www.golabz.eu/. We have been introduced to a variety of eTools and resources to engage students in learning about science and in particular, physics and high energy physics. An interesting app I want to share allows you to use your cell phone as a cosmic ray muon detector: http://crayfis.io/ . The Institute's leaders, Dr. Eugenia Kypriotis, Dr. Angelos Lazoudis, and Dr. Rosa Doran, have organized an outstanding offering for science teachers. Thank you to QuarkNet for sponsoring this opportunity to collaborate with teachers throughout the world in beautiful Greece!
Summer Project: Creation of a Data Visualization Program using R
Project Goal- To build an interactive website to visualize data taken from the weather detectors located at the 4850 level (4 Winze Wye, 17 Ledge, and Governor’s Corner) of SURF.
Day 1- 6/22/15
I received the main project overview today, and began my research.
Somebody else figured out how to plot the data in an interactive way, using the programming language R. However, we still do not quite understand how they did it, and are attempting to replicate the results. I have decided to take an online course, similar to Codecademy, that will explain R and its uses better, as I am still a novice.
Scripts are called using the source function. For instance, if one had a script entitled "CommenceWorldConquest.R", one would simply use source("CommenceWorldConquest.R").
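A self-contained way to try this (the script name above is fanciful, so here a throwaway script is written to a temporary file first):

```r
# Write a tiny one-line script to a temp file, then run it with source().
path <- tempfile(fileext = ".R")
writeLines("answer <- 6 * 7", path)

source(path)   # evaluates the script in the current environment
answer         # the variable the script created is now defined here
```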
The next section explained vectors, which are lists of values. Vectors can be strings, Boolean values, or numbers, as long as all the values are of the same type. For instance a vector cannot contain 2 numbers and a string. Vectors are created using the “c()” function.
Vectors can also be created using "start:end" notation, which creates a list of values from the first number listed to the last. "6:8" would generate "6,7,8". Another way to create these lists is to use the sequence function. The format for the sequence function is "seq(start,end,increment)". For instance, "seq(6,8,0.5)" would create a sequence of five numbers from 6 to 8 that increase by 0.5 each time.
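The three creation styles side by side (values match the running 6-to-8 example):

```r
counting <- c(6, 7, 8)       # explicit values
also     <- 6:8              # start:end shorthand: 6 7 8
halves   <- seq(6, 8, 0.5)   # 6.0 6.5 7.0 7.5 8.0, five values
```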
If you store a vector in a variable, you can retrieve individual elements of the vector. For instance, if vector “6,7,8” was stored in variable “Counting”, one could type “Counting[2]”, which would return 7. This is similar to Java’s method of calling individual parts of a string, but different in that its values start at 1 instead of 0. You can also change the value of individual elements in the vector through this method. For example, “Counting[3] <- 9” would make the third number be 9 instead of 8. In addition to changing the values, you also can add values to the vector. Entering “Counting[4]<-9” would add 9 as a fourth value to the vector. It is also possible to access multiple values. “Counting[c(1,3)]” would return “6,8”. One can also use the start:end notation. “Counting[1:3]” would return 6,7,8.
The next lesson explained how to name elements. One does this by using the names command. For instance, names(counting) <- c("six", "seven", "eight") would name 6,7,8 as six,seven,eight. Names can then be used like indices.
One can do vector math using R. After giving your vector values, you can add, subtract, multiply, and divide with ease. For instance, counting+1 would return 7,8,9. Counting*2 would return 12,14,16. You can also add vectors to vectors. For instance, if I had a second vector “B” equal to (4,3,5), “counting+B” would return (10,10,13). You can also compare vectors by using comparator operators, which will check the values and return either a “TRUE” or a “FALSE” value. Vectors can also be used with trigonometric functions.
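The indexing, naming, and arithmetic rules above fit in a few lines (note that R indexes from 1, unlike Java):

```r
counting <- c(6, 7, 8)
counting[2]                    # 7 (indices start at 1, not 0)
counting[3] <- 9               # replace the third element
counting[c(1, 3)]              # 6 9
names(counting) <- c("six", "seven", "nine")
counting["seven"]              # elements can also be fetched by name
counting + 1                   # element-wise arithmetic: 7 8 10
counting + c(4, 3, 5)          # vector + vector: 10 10 14
```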
The Plot function creates a scatterplot of the x and y coordinates. X and Y can be substituted for any variable, just so long as there is a variable.
It is possible for a value to not be available, in which case you put it in as "NA". Using a vector with an "NA" value in a function will return "NA" unless told otherwise, using the "na.rm" argument. One would enter that like so: "sum(a, na.rm = TRUE)".
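A two-line demonstration of how one NA poisons a result until na.rm removes it:

```r
a <- c(1, 2, NA, 4)
sum(a)                # NA: the missing value propagates
sum(a, na.rm = TRUE)  # 7: drop NAs before summing
```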
One can make a matrix in R by using the "matrix()" function. The format for matrix input is matrix(data, nrow, ncol). For instance, matrix(b,5,6) would use the values of vector b to fill a 5x6 matrix (recycling them if b holds fewer than 30 values).
The "dim" function assigns dimensions to a vector. For instance, dim(b) <- c(3,6) would cause vector b to turn into a 3x6 matrix.
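Both routes to a matrix, on a small 3x4 example (sizes chosen here just for illustration):

```r
b <- 1:12
m <- matrix(b, 3, 4)   # 3 rows, 4 columns, filled column by column
dim(b) <- c(3, 4)      # reshapes b itself into the same 3x4 matrix
```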
To get a value from a matrix, one would use the same method as getting a value from a vector, but use 2 fields instead of 1.
One can also use this method to reassign values.
You can retrieve all the values in a row by omitting the column index: simply leave the spot after the comma empty. To retrieve all the values in a column, omit the row index instead.
You can also use the above method combined with the “start:stop” format to retrieve all the values in multiple rows or columns. For instance plank[,3:4] would return all values in columns 3-4.
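All of the matrix-access patterns above, using a small stand-in for the "plank" matrix (the contents are made up):

```r
plank <- matrix(1:24, 4, 6)   # hypothetical 4x6 matrix
plank[2, 5]     # one element: row 2, column 5
plank[3, ]      # all of row 3 (column index omitted)
plank[, 2]      # all of column 2 (row index omitted)
plank[, 3:4]    # every value in columns 3 and 4
```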
One can use the matrix to create powerful visualizations. For instance, using the contour function, one can create a contour map of data.
The median function finds the middle value of a vector ordered from least to greatest. One can also find the standard deviation using the sd function. One can then plot these results on a graph to see what a “normal” result would be.
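On a small set of made-up readings, the two summary functions look like this:

```r
x <- c(2, 4, 4, 4, 5, 5, 7, 9)   # hypothetical data
median(x)   # 4.5: the middle of the sorted values
mean(x)     # 5
sd(x)       # about 2.14: sample standard deviation
```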
The factor function categorizes types of data: the levels function shows the categories, as.integer shows which category number each value belongs to, and table counts how much of each type there is.
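A minimal factor example (category names are made up; levels are sorted alphabetically, so bronze is 1, gold is 2, silver is 3):

```r
types <- factor(c("gold", "silver", "gold", "bronze"))
levels(types)       # the categories: "bronze" "gold" "silver"
as.integer(types)   # each value's category number: 2 3 2 1
table(types)        # counts per category
```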
When plotting data, the legend function will take an area of the graph, a vector with label names, and vector with plot character IDS (pch). It is a good idea to use the levels function, which will prevent you from having to go in and change everything if you add 1 variable.
The data.frame function creates a data table not unlike an Excel spreadsheet.
Just like matrices, one can collect individual elements from the table. For instance, to select the second column in the above matrix, one would type treasure[[2]] (BE SURE TO USE DOUBLE BRACKETS!!). You can also use the string name of the column such as treasure[[“weights”]]. However, a shorthand notation is also used. It is the data-table name, a dollar sign, and then the column name WITHOUT quotes. For instance treasure$prices would generate the same answer as treasure[[“prices”]].
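The three equivalent ways of pulling a column, on a stand-in for the "treasure" table (the numbers are invented):

```r
treasure <- data.frame(
  weights = c(50, 63, 72),
  prices  = c(370, 413, 500)
)
treasure[[2]]          # second column, double brackets
treasure[["prices"]]   # the same column, by name
treasure$prices        # dollar-sign shorthand for the same thing
```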
The read.csv function allows one to import a csv file into a data table. CSV stands for Comma Separated Values. For files that have different separators than commas, such as slashes, one uses the read.table command. Using the sep argument, you can define the character used for separating items.
If no header is specified, the columns come out named V1, V2, and so on. One can specify a header by adding the argument "header=TRUE", which makes the V1 and V2 names go away.
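A round-trip through a small comma-separated file (written to a temp path so the example is self-contained; the column names are made up):

```r
path <- tempfile(fileext = ".csv")
writeLines(c("time,flow", "1,10.5", "2,11.2"), path)

d1 <- read.csv(path)                             # header = TRUE is the default here
d2 <- read.table(path, header = TRUE, sep = ",") # read.table needs both spelled out
```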
The merge function joins two data tables on a shared column, which allows one to have a table with multiple Y values sharing the same X values.
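A sketch of merging two series on a common time column (the column names and values are invented):

```r
flow     <- data.frame(time = 1:3, flow = c(10.5, 11.2, 10.9))
pressure <- data.frame(time = 1:3, psi  = c(14.1, 14.0, 14.2))

both <- merge(flow, pressure, by = "time")   # one row per shared time value
```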
Day 5-8 (Learning R)- 6/26/2015-7/1/2015
These last few days have been spent learning the ins and outs of the R programming language. I tried multiple online tutorials, such as the free one offered by Udemy and a course on Youtube. After I felt comfortable enough with R, I began using the Shiny package, which is what Chuck’s program is built on. I worked through the tutorial provided by the developers of Rstudio, and a video webinar provided by the same people. These provided the basic information that I felt I needed to attempt to work with Chuck’s script.
The above images are screencaps from the output of Chuck’s script. It creates 3 graphs of data that the user inputs, and a data table with it. So, I figured the first step of attempting to modify Chuck’s script was to simply get it to run. This turned into a quite grueling ordeal. At first, when I tried to run it, I realized that I needed to change the filepath, as his data was stored in a different directory than mine was. After this, I thought the script would run fine. However, whenever I ran it, it would return the least helpful error message I have ever seen.
“Replacement has 0 Rows, Data has 34257”.
I did not know how to fix it. I isolated the error message down to one line, but everything I would change in there would simply spit out different error messages. The next day, I was ready for another struggle to get that to work, but I was able to figure it out rather quickly. I opened up the data file I was using, and removed various parts of the header, and I also removed the spaces in the header. This was the whole problem. I was able to get the script to run flawlessly. The next step was to try to modify the script to achieve our goals in visualization:
· Different colors for each graph
· Multiple data sets on one axis (For instance, being able to graph airflow and air pressure on the same graph)
· 6 different data sets
· 2 graphs.
I started trying to change the color. This proved much easier than I imagined, as I simply had to add a “colour” argument to a small area of Chuck’s script.
I did that for each output, with a different color. The next step I attempted was to make each graph a readable line. I did this using the geom_line() command. In Chuck’s original script, he used geom_point(), which we thought looked more disorganized than a line. I then copied the above script and changed the inputs to create 6 different data sets, but also inadvertently creating 6 different graphs.
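Chuck's script itself is not reproduced here, but the two changes just described can be sketched with made-up data, assuming ggplot2 is installed:

```r
library(ggplot2)

# Hypothetical readings standing in for one of the SURF weather channels.
df <- data.frame(time = 1:10, flow = cumsum(rnorm(10)))

# geom_line() in place of geom_point(), with a fixed colour argument.
p <- ggplot(df, aes(x = time, y = flow)) +
  geom_line(colour = "steelblue")
```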
I was successfully able to have a graph that created multiple y outputs on one plot.
It is definitely much different from the original format, which only allowed one Y plot for the X plot.
I then created a second merged graph, for analysis purposes. I also created a second set of inputs.
My next step will be to try and create a legend for the graphs, as it can be hard to differentiate between the various elements of them.
7/1-7/7 (Trying to Create a Legend)
This task was much more difficult than I anticipated, probably taking more time than any of the other individual components of the shiny app. Every single approach I tried ended in unhelpful error messages. My frantic searching on the internet proved to be unhelpful. The only useful fact I found was that ggplot2 (the R package required to graph our data) was supposed to be creating legends automatically. Finally, after a week of searching, trying everything I found, I finally was able to create a legend. It is quite rudimentary, but it should at least give me a baseline to try and get one that suits our purpose better. The code that finally generated the legend is below:
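The working code itself is not reproduced here. Purely as an illustration of the fact noted above, that ggplot2 builds legends automatically: it does so only when a grouping column is mapped to colour inside aes(); a colour set outside aes() produces no legend. A minimal sketch with invented data:

```r
library(ggplot2)

# Two made-up series in "long" format: one value column plus a label
# column. Mapping colour = series inside aes() is what triggers the
# automatic legend; geom_line(colour = "red") would not.
df <- data.frame(
  time   = rep(1:10, 2),
  value  = c(cumsum(rnorm(10)), cumsum(rnorm(10))),
  series = rep(c("flow", "pressure"), each = 10)
)

p <- ggplot(df, aes(x = time, y = value, colour = series)) +
  geom_line()
```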
A Dynamic Legend (or How I Learned Programming Can Be Like Beating Your Head Against a Wall. Repeatedly.)
7/7-7/10
These last three days have proven to be quite stressful. I have yet to make any true progress from my last entry, except for an incredible amount of trial and error. The only true accomplishment I had was changing the line size, and even that was somewhat difficult. I have been trying to do two things:
1. Create a legend that changes dynamically based upon user inputs
2. Give the graphs multiple Y axes, for analysis purposes.
The second one seems to be completely unsupported by the developers of ggplot2, the package we have been using to create the graphs. This is due to the fact that it goes against their philosophy, which seems incredibly arbitrary in my opinion.
My first goal also seems to be a feature that nobody else wants to use, as I was unable to find any documentation about it during the long hours that I spent researching these issues.
Eventually, we both gave up on trying to solve this, and emailed Chuck, the author of the original code, to see if he has any ideas. He pretty much reiterated what we learned in the last few days: both of our final goals are practically impossible in ggplot2.
7/13/15
We have shifted focus away from the dynamic legend, and have started working on separate projects. I have started looking for a way to allow the data to be imported from a webserver. My first step was getting my data onto the internet, using Google Drive and Dropbox. I then tried to load it back into R from those websites, and failed miserably. I continuously received a “duplicate row names not allowed” error. I take back what I said earlier; this is by far the least helpful error message in existence. After a Sisyphean effort, I finally was successful in isolating the error. It was a problem in the URLs themselves, which was remedied by the usage of the repmis package. This allowed me to read the file from Dropbox.
^The code I used to read the file
However, I cannot input velocity values now, as it generates another vague, unhelpful error message. I’m starting to think that programmers actively conspire to make error messages intentionally impossible to understand.
I was able to give the data a cursory analysis, and there are a few interesting things that keep cropping up. For instance, the Governors Corner and 17 Ledge graphs are incredibly similar.
In conclusion, we were successfully able to create an interactive program for data analysis. During my time spent here, I learned how to:
· Use R
· Use ggplot2
· Use Shiny
As I was told by a professor at BHSU, many research opportunities require that one build their own equipment and create their own programs, as the required materials will likely not have been made yet. I believe this project was incredibly valuable in that aspect, as it introduced me to the concepts of computer programming.
Visualization of flow meter data collected at SURF future site of DUNE
The current project this summer is for the QuarkNet student to continue a data visualization web site that was started a summer ago by an intern working at the Sanford Underground Research Facility. Data is collected by three underground weather stations using Campbell Scientific equipment; the stations are located at the 4850 level at SURF. The student (W. Smith) is being exposed to the Campbell Scientific software that allows the three data sets to be combined into one. This combined data set is then imported into RStudio, and a script is being developed to visualize the data interactively. Shiny reads the script and dynamically builds a web page.
2014 Annual Report - BHSU
Black Hills State University
2014 Annual Report
QuarkNet Center Name, number of years in the program
Black Hills State University, Year 6
List of faculty/staff/student participants including the role each played
Kara Keeter: BHSU Faculty, physics mentor
Jaret Heise: Science Director, Sanford Laboratory, physics mentor
Students who were not part of the BHSU QuarkNet program but who interacted with the high school teachers during the week they were here:
Kristin Rath: undergraduate student, Black Hills State University
Erik Belsaas: undergraduate student, Black Hills State University
Alexander Kramer: undergraduate student, Dakota State University
Description of the teacher participants
Chad Ronish, Hill City H.S., Hill City, SD – Lead Teacher (from Summer 2009)
Rose Emanuel, Lead-Deadwood H.S., Lead, SD (from Summer 2010)
LuAnn Lindskov, Timber Lake H.S., Timber Lake, SD (from Summer 2010)
Mechelle Powers, Custer Middle School, Custer, SD (from Summer 2010)
Steve Gabriel, Spearfish H.S., Spearfish, SD (from Summer 2011)
Deirdre Peck, Aberdeen Central H.S., Aberdeen, SD (from Summer 2012)
Doug Scribner, Newcastle H.S., Newcastle, WY (from Spring 2013)
James Stith, Newcastle H.S., Newcastle, WY (from Summer 2013)
Zach Beam, Newcastle H.S., Newcastle, WY (from Spring 2014)
Zach Beam is a new teacher, and is enthusiastic about joining the other Newcastle teachers in QuarkNet. He is a recent BHSU graduate, and was a student of Keeter’s.
This summer Rose Emanuel left Lead-Deadwood and moved to Washington. We will miss her energy and contagious excitement. We hope her successor will join QuarkNet; in the meantime, Steve Gabriel is working with Sanford Underground Lab to deploy “Rose’s” CRMD onsite.
Description of the student participants
J. Ivy, Aberdeen Central H.S. (senior)
O. Smith, Spearfish H.S. (sophomore)
J. Wieland, Aberdeen Central H.S. (junior)
Activities for 2013/2014
LHC Masterclasses, 21 & 29 March 2014
The BHSU QuarkNet Center hosted two Masterclasses in March, 2014 at BHSU.
March 21, 2014:
Attended by students from Hill City (Teacher: Chad Ronish) and Newcastle, WY (Teacher: Zach Beam).
March 29, 2014:
Attended by students from Spearfish (Teacher: Steve Gabriel), Timber Lake (Teacher: LuAnn Lindskov), and Aberdeen (Teacher: Deirdre Peck).
Summer Teacher Institute, 24-27 June 2014
Seven teachers attended the BHSU QuarkNet Summer Teacher Institute the week of June 24-27, 2014:
Zach Beam
Rose Emanuel
Steve Gabriel
LuAnn Lindskov
Chad Ronish
Doug Scribner
Jim Stith
Deirdre Peck was unable to attend due to a prior commitment to help with the Davis-Bahcall Scholars Program. However, she did send her CRMD with the two students from Aberdeen (Jordan Ivy and John Wieland) who worked with her detector during the workshop.
Mechelle Powers was also unable to attend, but she sent her CRMD with Chad Ronish.
In all, six CRMDs were sent to the BHSU QuarkNet Center, and we had seven at the workshop.
Advanced CRMD Workshop Agenda
June 24-26, 2014
• QN & cosmic overview
• setup CRMDs (everyone should bring their detectors!); trouble-shoot if needed
• learn the new EQUIP java interface, and take data
• recalibrate all the counters (including pressure)
• take data
• review the CR e-Lab
• upload data; learn the new blessing tools; bless any past data
• design two classroom activities: shower? azimuth? Altitude?
• short investigation & poster, if time permits
June 27, 2014
• Above-ground and underground tour of Sanford Lab, including the Majorana Demonstrator and LUX detector at the Davis Campus, and the copper electroforming lab at the Ross Campus; lunch is served underground.
Virtual QuarkNet Teacher Workshop, 22-24 July 2014
This summer BHSU was privileged to host the 2014 summer meeting of the Virtual QuarkNet Center. Eight teachers attended July 22-24, 2014:
Virtual QuarkNet Workshop Agenda
July 22, 2014
• Cosmic rays and CRMDs, including the new EQUIP interface
July 23, 2014
• Visit the Sanford Lab and learn all you ever wanted to know about neutrinos
• Attend lecture and round-table discussion with CETUP* scientists
July 24, 2014
• ILC tentative plans and simulated data
Summer Research Student Program, 9 June – 19 July 2014
For the second summer in a row, we were able to offer a six-week research program for high school students. Three students participated: J. Ivy (Aberdeen, senior); O. Smith (Spearfish, sophomore), and J. Wieland (Aberdeen, junior). Steve Gabriel was the mentor teacher. Steve led the students in expansion and optimization of a system for remote control of underground environmental monitors. Their work involved computer programming to control the equipment and web page development to monitor the equipment and data. Jordan Ivy, who was over 18 years old, was able to accompany Steve underground to deploy the instruments.
In addition, O. Smith and J. Wieland designed a feedback system for precisely controlling the temperature of a laser spectroscopy cell to be used in cavity ring-down studies for the DarkSide liquid argon dark matter detector in Gran Sasso, Italy. J. Ivy searched the CRMD eLab database investigating possible correlations between extreme weather conditions and muon flux.
Along with their research, the students attended lectures and informal discussions on nuclear and particle astrophysics with Kara Keeter, Brianna Mount, and Jaret Heise. They also attended several of the Davis-Bahcall and CETUP* lectures and events, including a round-table discussion with astrophysics theorists and experimentalists attending CETUP*. The activities culminated with the students joining the CETUP* participants for an all-day excursion to Mount Rushmore and the Crazy Horse Memorial which included lunch and a round-table discussion with scientists and Native American students at the Crazy Horse Memorial on July 19.
BHSU Abstract-Cosmic Ray and Weather Correlation Study
J. Ivy (Aberdeen Central High School)
Steve Gabriel (Spearfish High School) Dr. Kara Keeter (Black Hills State University)
The purpose of this study was to locate and isolate instances of coincidence between muon flux and major weather events over the last three to five years. We conducted this search by running flux studies on reliable cosmic ray data during the time of three major weather events: the tornado outbreaks of May 2013, Hurricane Sandy, and the Black Hills blizzard of October 2013. For each of the events, we used the cosmic ray data of the Spearfish High School CRMD (Cosmic Ray Muon Detector), which has the most consistent data of any of the detectors, as a baseline. The outbreak study looked at data from the Spearfish, Arkansas City, KS, and Fermilab detectors. All three detectors showed an increase in flux during both periods of the outbreak, May 16-18 and 25-31, and recorded a drop in events between the outbreaks. Because of the need to also match barometric pressure and the inconsistency of one of the detectors, we were not able to determine a correlation. The second study, on Hurricane Sandy, looked mostly at a detector in Michigan. The muon flux in the data corresponded to fluctuations in barometric pressure, rising and falling at approximately the same rate, and this coincided with the landfall of Sandy. Due to the lack of data from other detectors, the Michigan one being the only one within 2,000+ miles, I was able to find coincidence, but correlation could not be determined. The Black Hills blizzard study focused on the flux data of the Spearfish detector and a detector in the Lead-Deadwood area of South Dakota. There was no correlation in the data, and there were inconsistencies that made determining any correlation nearly impossible without further investigation. Across all three of our studies, we could not find any correlation between the weather events and muon flux, due to inconsistency of data, lack of other data sources, and time constraints.
At this time, further investigation would be required to confirm my findings or to find evidence of correlation.