We’ve conducted some initial analysis of our data from last Friday (more here). Our goal was to better visualize our sample data along the stream corridor. As a geographer, my first impulse is to reach for a map, but in this case it’s certainly not the only way of viewing our data. In particular, we were interested in comparing the values from our two units and better visualizing trends in our metrics along the stream corridor.
So, from our magic GIS hat, we pulled out some dynamic segmentation tools to “linearize” the data we’ve collected. Put simply, we moved each data point to a stream centerline as defined by USGS’s National Hydrography Dataset (NHD), yielding a distance along the stream. We could then plot each metric, e.g. dissolved oxygen, versus distance along the stream in a conventional scatterplot, allowing us to compare our two sample units and samples through time.
Here’s a visualization of the linearization of our data:
In ArcGIS, each data point (in red) was moved to the NHD stream centerline (points in blue) if it fell within 100 meters of the centerline. In addition, each new blue point was assigned a distance along the stream stretch.
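For anyone wanting to reproduce the linearization step outside ArcGIS, here is a minimal sketch using the open-source shapely library. The centerline coordinates and sample points below are invented for illustration; our actual workflow used ArcGIS linear referencing tools against the NHD centerline.

```python
# Minimal sketch of the linearization step, assuming projected
# coordinates in meters (the real work used ArcGIS and NHD data).
from shapely.geometry import LineString, Point

# Hypothetical stream centerline
centerline = LineString([(0, 0), (500, 0), (500, 400)])

# Hypothetical GPS sample points logged from the kayaks
samples = [Point(120, 30), Point(510, 90), Point(505, 600)]

for p in samples:
    dist_along = centerline.project(p)            # distance along stream
    snapped = centerline.interpolate(dist_along)  # point moved onto line
    offset = p.distance(snapped)                  # how far off-line we were
    if offset <= 100:  # keep only points within 100 m of the centerline
        print(f"{dist_along:.0f} m along stream (offset {offset:.0f} m)")
```

`project` gives the distance along the line to the nearest point, and `interpolate` turns that distance back into the snapped point, which together mirror the red-to-blue move in the figure.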
These data are as yet unfiltered; we haven’t removed known extraneous and/or invalid readings.
First, data from the Kinnickinnic (which can be compared to the map here):
Temperature is almost identical for both units. Note that the blip is due to Unit #2’s temperature element being out of the water for a time.
Dissolved oxygen is also very similar (thankfully, after quite a struggle) between the two units. The mess to the left in the graph is explained below.
Electrical conductivity also shows close correspondence between the two units, except at fairly high values. We use a two-point calibration at 84 uS and 1,413 uS, so divergence outside the calibration range is not all that surprising.
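To illustrate why readings above the upper standard drift, here is the arithmetic behind a generic two-point linear calibration. This is not Atlas Scientific’s actual firmware math, and the raw readings are invented:

```python
# Generic two-point linear calibration sketch (invented raw readings;
# not Atlas Scientific's actual firmware math).
def two_point_cal(raw, raw_lo, raw_hi, true_lo=84.0, true_hi=1413.0):
    """Map a raw reading onto the line through the two calibration
    points, taken in the 84 uS and 1,413 uS standard solutions."""
    slope = (true_hi - true_lo) / (raw_hi - raw_lo)
    return true_lo + (raw - raw_lo) * slope

# Inside the 84 to 1,413 uS span this is interpolation; above 1,413 uS
# it becomes extrapolation, so any sensor nonlinearity is amplified.
print(two_point_cal(2000.0, raw_lo=90.0, raw_hi=1400.0))
```

Between the two standards the line is pinned down at both ends; beyond them, any curvature in the probe’s true response shows up as growing error, which matches the divergence we see at high conductivities.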
Our pH values, while still exhibiting similar trends, show the greatest discrepancy between the two units, and, disconcertingly, the offset varies throughout the sample. We’ll be revisiting our calibration procedure for the pH probe to make sure we’re consistent with each unit.
For the Menomonee River
For the Menomonee, our units again performed similarly, with all but pH matching fairly closely.
Why the jagged, seemingly noisy segments? They result from plotting both the paddle out and the paddle back along each segment on the same graph. For dissolved oxygen on the Menomonee, for example, Unit #3 (in orange) returned significantly different values on the way out than on the way back in the 6,200 to 6,700 meter range. Mike, paddling Unit #3, took a different path on the way back through this segment, and there appears to have been a significant cross-sectional change in DO across the stream profile. Since we’re linearizing and combining both the out- and back-paddles, our line graph bounces back and forth between these values.
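One way to tame the zig-zag would be to split the combined track into its out and back legs before plotting, using the sign of the change in stream distance. A rough sketch, with invented records (each a distance-along-stream and metric value pair in time order):

```python
# Split a time-ordered track into out and back legs by whether the
# distance along the stream is still increasing (record layout invented).
def split_legs(records):
    """records: list of (distance_along_stream, value) in time order."""
    out, back = [], []
    for i, rec in enumerate(records):
        if i == 0 or records[i][0] >= records[i - 1][0]:
            out.append(rec)   # distance still increasing: outbound leg
        else:
            back.append(rec)  # distance decreasing: return leg
    return out, back

track = [(0, 8.1), (100, 8.0), (200, 7.5), (150, 7.7), (50, 8.2)]
out_leg, back_leg = split_legs(track)
```

Plotting `out_leg` and `back_leg` as two separate lines would keep the cross-sectional difference visible without the single line bouncing between the two paths.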
Obviously, we still have a lot of explaining to do for each of these trends. We’ll leave that for another post for now.
Well, we’ve encountered our first big hurdle with our water quality mapping. We’ve been so focused on getting the technology connections to work (Arduino -> Phone app -> Server) and getting our kayak mount system to work that we’ve neglected data quality a bit. We’ve been operating on the assumption that we don’t have any electronic interference issues in our device and that the accuracy specs from Atlas Scientific are true.
We found out that we’ve got a problem somewhere.
Before heading out to the Milwaukee River, our first “big” sampling destination, we headed up to the Burnt Village put-in on Highway N on the Bark River with both boats and a borrowed YSI probe to compare our numbers to. With everything mounted on the kayaks, we all got in the water and started comparing numbers. Our first disappointment was that the Atlas Scientific arrays didn’t match the YSI values at all — not even close. And, even more disappointing, the Atlas Scientific arrays didn’t match each other — even after having been calibrated using the same techniques and same solutions earlier that day.
Time to hit the drawing board…
GIS exists to solve spatial problems. Vehicle routing is a spatial problem, with garbage routing being a classic example. Recently, the studio undertook a project assisting the Milwaukee Department of Public Works (MDPW) in creating a network dataset to efficiently route the garbage trucks that service over 190,000 households. A project like this requires coordinating many variables, most importantly time, capacity, and efficiency.
My role was to analyze the current garbage pick-up routes by district, day of pick-up, truck capacity, and cart capacity. My initial findings showed that routing within each district was not uniform by day, creating inefficiencies: trucks had to travel the same streets on multiple days of the work week.
Using Excel, I was able to break down the attribute data provided to us by MDPW and find trends. I found that the number of carts picked up per day, the pounds picked up per day, and the pounds per truck per day were not consistent, despite the same number of trucks running each day. I was then able to recommend targets for carts/day, pounds/day, and trucks/day.
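The same kind of per-day breakdown can also be scripted. Here is a pandas sketch with invented column names and numbers (the real MDPW table is larger and structured differently):

```python
# Pandas sketch of the per-day breakdown done in Excel.
# Column names and values are invented for illustration.
import pandas as pd

pickups = pd.DataFrame({
    "day":    ["Mon", "Mon", "Tue", "Tue"],
    "truck":  ["T1", "T2", "T1", "T2"],
    "carts":  [900, 850, 1200, 600],
    "pounds": [18000, 17500, 24500, 11000],
})

# Aggregate carts/day, pounds/day, and trucks/day in one pass
daily = pickups.groupby("day").agg(
    carts=("carts", "sum"),
    pounds=("pounds", "sum"),
    trucks=("truck", "nunique"),
)
daily["pounds_per_truck"] = daily["pounds"] / daily["trucks"]
print(daily)
```

A table like `daily` makes the day-to-day inconsistency obvious at a glance, which is exactly the pattern the Excel analysis surfaced.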
We found that when creating a network dataset for network analysis, it is important to first analyze the system already in place and uncover its existing inefficiencies, so they can be corrected in the resulting routing solution.
By Alvin Rentsch
Geocoding Addresses from an Excel Sheet
- Be sure to format the Excel sheet so that the address is broken up into separate columns.
- Address (house # & direction), City, State, and Zip Code
- Once the addresses are in this format, you can add the sheet to ArcMap the same way you would add any other data.
- Once you have the sheet in ArcMap, right-click on the file and select “Geocode Addresses”.
- Select an address locator to use from the list and click “OK”.
- Be sure that your input fields match those in your Excel sheet, and save your output to a proper location.
- View the number of matched addresses to see how many were correctly matched.
- Unmatched addresses may be the result of improperly formatted addresses in the table, or of entries that ArcGIS does not recognize as addresses.
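If your addresses arrive in a single column, a small script can break them into the separate fields before you load the sheet. Here is a sketch that assumes clean “street, city, state zip” strings (real-world addresses are messier, and the sample address is invented):

```python
# Split a one-column address into the separate fields the geocoder
# expects. Assumes a clean "street, city, state zip" format.
def split_address(full):
    street, city, state_zip = [part.strip() for part in full.split(",")]
    state, zip_code = state_zip.split()
    return {"Address": street, "City": city,
            "State": state, "Zip": zip_code}

row = split_address("2200 E Kenwood Blvd, Milwaukee, WI 53211")
```

Running this over each row before export gives you exactly the Address / City / State / Zip columns the locator’s input fields map onto.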
How to use the Customer Prospecting Tool for Business Analyst
Customer prospecting allows you to locate regions with ideal demographic characteristics for targeting new customers. The tool produces an output layer containing the demographic data you define in the Demographic Query dialog box, and displays only the areas that meet the criteria you set. For example, a new company could use this tool to show where its target market is located once it has profiled its typical customers.
1) Load a Business Analyst ArcMap session. Under the Business Analyst Desktop Message Center, click on the link that says business analyst.mxd under recent map documents. This mxd file will show up with all the geographies that you need (zip codes, county areas, state areas, etc.)
2) Click the Business Analyst drop-down menu and select “Analysis”
3) Click “Create New Analysis”
4) Click “Market Analysis”
5) Click “Customer Prospecting”
6) Choose the geography at which you want your information displayed (zip codes, county areas, state areas, etc.). Then click next.
7) Choose whether you want the information displayed at your current extent or if you want it to be displayed based on a layer. Then click next.
8) Choose whether you want to use a customer layer with the demographic information already in it or to choose your demographics manually. Then click next.
9) If you chose the customer layer, select the demographics you want shown on the map. If you chose the manual option, double-click each demographic you want and set its floor and ceiling values; you also need to decide whether to match all of the criteria, match any of the criteria, or build a custom query. Then click next.
10) Here you need to name the analysis layer and then click finish.
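Under the hood, the manual floor/ceiling query in step 9 is range-filtering with an all-or-any combiner. A toy Python illustration (field names and thresholds are invented; Business Analyst evaluates this against its own demographic layers):

```python
# Toy illustration of floor/ceiling demographic matching.
# Fields and thresholds are invented for this example.
criteria = {
    "median_income": (60000, 120000),  # (floor, ceiling)
    "median_age":    (25, 40),
}

def matches(region, mode="all"):
    """Does a region pass every criterion ('all') or at least one ('any')?"""
    checks = [lo <= region[field] <= hi
              for field, (lo, hi) in criteria.items()]
    return all(checks) if mode == "all" else any(checks)

zip_a = {"median_income": 75000, "median_age": 31}
zip_b = {"median_income": 45000, "median_age": 33}
```

Here `zip_a` passes in “all” mode, while `zip_b` passes only in “any” mode because its income falls below the floor, which is the practical difference between the two matching choices in the dialog.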
How to Display Business Points using NAICS codes with Business Analyst Desktop
The NAICS code, North American Industry Classification System, “is the standard used by Federal statistical agencies in classifying business establishments for the purpose of collecting, analyzing, and publishing statistical data related to the U.S. business economy” (US Census Bureau). These data are accurate but somewhat incomplete, because changes are not captured after the initial NAICS code is assigned. One issue is that when a business changes or adds services, it often does not update or extend its original NAICS code classification. Such businesses would not show up in our search, but that doesn’t mean they don’t exist.
1) Make sure your preferences are set up to save in the correct folder. (see training document called How to set up your preferences to save your files in Business Analyst in the U drive under the training folder)