First off, our instructor really went the extra mile with the lecture. It was a full 156 minutes with two significant demo sessions. Our lecture slides were full of good, "doable" information. Compared with the exercises, there was significant, realistic, get-it-accomplished information. Let's give Theron a huge thanks!
For this newbie, at least, some of it is beginning to make sense. Truly, it is comparable to learning a new language. At least I'm not as lost as I was and can find the door better than the bear in Assignment 4c; that was a crazy insanity Python loop joke, I think.
Some tips of significance.
a) Save your work if you can.
b) Print out your IDLE windows to learn how you did that, because many times it gets lost in translation. By doing the printout, I began to see my common typing mistakes, which result in simple, time-eating syntax errors. I also began to see in print how to correct other, more significant issues.
c) Type slower; get a pair of better glasses. The . and , look the same late at night but are not the same to the computer.
d) Case sensitivity. Ugh, this program just cannot suck it up and get along; it must have its way, specifically. A diva in disguise.
e) Looking up help on the internet can be a challenge. Learning what to call the thing I'm not figuring out is a stretch.
f) Usually when I think I'm the only one, there is someone else out there who is also too embarrassed to ask, because it seems so small an issue, which is eating me alive (the snake has a big mouth).
g) I'm not a quitter; I'm determined and will eat anything alive to figure it out. So, don't take anything personally; it's just the learning process here.
Other tips with the program.
1- Indentation, it matters.
2- Case sensitive, said this already.
3- One teeny tiny thing will cause it not to "go," in my language.
4- Running a script one line at a time is a good plan to catch mistakes; you can correct in the dual script window as you go.
5- Naming files & scripts more clearly than the data, according to what they did or accomplished, makes them easier to find later.
6- Using a dual screen, with pywin for scripts on one side and IDLE on the other for running line by line, is a good combination.
7- Entering text for a long passage is an absolute pain; there must be a better way. Will find it soon.
8- If, elif, else are tricky. They need indentation, and you can indent within the indent. They are more like the old flowcharts: if yes, go there; if no, go here; but there is always one. They need a way out, otherwise it becomes a crazy loop thing that keeps going. Like the computer at the fair that must have had a plus-one program that was up to 205,000,000 and going.
9- One always has to tell the program what something is for it to work. Much like husbands: they are not able to read minds, and the wife who thinks it in her head and does not get it out of her mouth cannot blame the husband who did not do it, because he did not know he was supposed to, because he never got the memo that was never spoken by the wife who expected him to know what she wanted.
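Tips 8 and 9 above can be sketched in a tiny Python example. The names `count` and `limit` are made up for illustration, but the pattern is the general one: tell the program what something is first (tip 9), and give every loop a way out (tip 8).

```python
# Tip 9: tell the program what something is before you use it.
count = 0
limit = 5

# Tip 8: if/elif/else like a flowchart, with a way out of the loop.
while True:
    count = count + 1
    if count < limit:
        print("still going:", count)
    elif count == limit:
        print("reached the limit, taking the way out")
        break  # the "way out" -- without this, it keeps going forever
    else:
        break  # safety exit, in case count somehow jumps past limit
```

Without that `break`, this would be the plus-one program at the fair, still counting past 205,000,000.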
Saturday, June 30, 2012
Thursday, June 21, 2012
Law Enforcement and GIS
Here's a review of an article in a textbook which shows the relationship between GIS and law enforcement. The High Point, North Carolina police department was highlighted for its intervention program. Kernel density is the primary analysis technique for the project.
When High Point, North Carolina law enforcement chose to use GIS to implement a data-driven focused deterrence model for their drug-related violence, they did not expect it to become a philosophy for their department. Further post analysis revealed a 31% decrease in drug offenses and a 37% reduction in violent crimes. Community attitudes toward law enforcement improved significantly, as shown through a post study of the 911 calls: calls increased and changed in nature from drug-related issues to quality-of-life calls. Post-intervention kernel density studies were conducted using the same selection
Participation Article Review - GIS & Law Enforcement
Review by Karen F. Mathews
GIS Applications, 5100 L
Article: Chapter 24, "Using GIS to Identify Drug Markets and Reduce Drug-Related Violence." The article was presented as a summary of the project's 2006 Herman Goldstein Award for Problem-Oriented Policing. Subtitle: "A Data-Driven Strategy to Implement a Focused Deterrence Model and Understand the Elements of Drug Markets."
Authors: Eleazer D. Hunt, Marty Sumner, Thomas J. Scholten, and James M. Frabutt
Law enforcement officials in High Point, North Carolina sought to break the cycle of drug-related crimes and the violence connected to drug dealing in their small community, a problem found nationwide in the largest of cities and the smallest of towns. The accepted modus operandi for addressing the drug problem nationwide had been the combined methods of surveillance, undercover drug buys, and routine mass drug sweeps by law enforcement. Instead of lowering the presence of drug activity, these standard procedures have, over time, increased community suspicion and distrust of law enforcement officials in general. High Point's goal was to use a new approach to an old problem. Breaking the cycle of police distrust and drug activity, and creating a continued community-ownership model of expected neighborhood behavior, is the nexus of the data-driven focused deterrence method described in this article. The objective is using an intervention deterrence program instead of a punishment model. However, it is the essential GIS techniques, used to develop an objective data set for the target intervention zone, that made the project "doable," successful, and later replicable.
To do an intervention, the High Point department needed to find the most suitable neighborhood zone and turned to GIS to assist with the task. A decision was made to use a single prior year as the source of data collection. The data consisted of 911 calls, police reports, drug arrests, field contacts, and specific drug-related crimes categorized as serious. GIS changed the data-use approach from a purely police perspective of "Where are the drugs? Let's go there" to a workflow question: "Where are there densities of violent, sex or weapons crimes spatially concurrent with drug sales?" (pg 398). Kernel density maps made on each layer were used as the primary analysis technique. As described in the article, several initial hypotheses were changed by the use of GIS in selecting the intervention neighborhood:
1) False hot spots were identified as default report locations;
2) most serious crime arrests were not associated with drug activity;
3) the housing complex wasn't the best place to have an intervention; and
4) a smaller number of individuals were directly involved in the sale of drugs.
Selecting the exact zone required cooperative efforts between GIS analysis and selected police activity to locate the particular community and persons to be approached with the intervention techniques. Through spatial analysis, GIS was able to develop a visual spatial structure of a drug market. An intervention site was chosen in the West End neighborhood, and the intervention was carried out.
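The kernel density idea behind those maps can be sketched in plain Python. To be clear, this is the general statistical technique, not the actual ArcGIS Kernel Density tool the department used, and the sample incident points and bandwidth below are invented for illustration: each incident contributes a Gaussian "bump," and nearby bumps stack up into a hot spot.

```python
import math

def kernel_density(x, y, points, bandwidth):
    """Estimate density at (x, y) from incident points with a Gaussian kernel.
    Incidents close to (x, y) contribute a lot; distant ones almost nothing."""
    total = 0.0
    for px, py in points:
        d2 = (x - px) ** 2 + (y - py) ** 2
        total += math.exp(-d2 / (2 * bandwidth ** 2))
    # Normalize so the surface integrates to roughly 1 over the plane.
    return total / (2 * math.pi * bandwidth ** 2 * len(points))

# Hypothetical incident locations: a tight cluster plus one outlier.
incidents = [(0, 0), (1, 0), (0, 1), (1, 1), (10, 10)]
print(kernel_density(0.5, 0.5, incidents, bandwidth=2.0))   # inside the cluster: high
print(kernel_density(10.0, 0.0, incidents, bandwidth=2.0))  # far from everything: low
```

Evaluating this over a grid of cells is what produces the smooth hot-spot surface, which is why a false hot spot appears wherever many reports share one default location.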
Wednesday, June 20, 2012
Learning how not to get choked by the snake
So far, I'm not doing so well with programming. It's taking longer to learn.
I've learned Job has more patience than I do.
Typing exactly and precisely without making andy mistakes is important. (That one was on purpose
to make a point).
There are geeks who are wired to understand. I am not one.
It's now too late in the night to write or understand anything else without being rude. I did not
complete the assignment. I'm sad that I ran out of time to understand. I'll be penalized for
being incomplete. At least I was able to get one little script to run right.
Therefore, when it's hard to say anything nice, it's better to be thought wise by
closing thy mouth.
Thursday, June 14, 2012
Big Waves Create Big Problems - The Tsunami Fallout
Wow! Now we are able to actually apply our skills to a real scenario. The tsunami of March 2011 had many aspects. One not expected was the vast expanse of radiation upon the people and landscape. We analyzed the radiation zones and the population affected. This is only a small segment of a much broader analysis.
Had a connection glitch with the computer, so I'm posting the Process Summary part here to ensure it gets included. (Gave it a numbered list to put the body of my text on the page.) Erg. Creating these blogs is not always smooth.
- Part I: Building a File Geodatabase
- 1) Reviewed the data files in ArcCatalog; saw the files will fall into natural groups. Following the steps of Ex 1.2, created (in file folder tsunamiGdb) > New > File Geodatabase: Tsunami.gdb.
- 2) Created a new feature dataset inside Tsunami.gdb: Transportation. Followed the steps and added the Rails shapefile to the gdb without difficulty. Followed the steps and was successful adding Roads in the same manner.
- 3) Added airfields.shp to Transportation as a single feature class. OK.
- 4) In a new map, saved as S:\GISAPPS\Tsunami\TsunamiGdb\kfm_Tsunami.mxd. Before doing anything else, set the environments for the map: used tsunamiGdb as the base workspace and base scratch workspace, and set the projected coordinate system as WGS_1984_UTM_54N.prj.
- Converted Fukushima.xls to a shapefile: File > Add Data > Add XY Data > chose the Fukushima file > fields for X/Y populated > set coordinate system > Projected Coordinate Systems > UTM > WGS 1984 > Northern Hemisphere > WGS_1984_UTM_54N.prj. OK, little green diamonds on the map. Will add the other parts to the map later.
- 5) Created a new raster dataset following the steps outlined. While in ArcCatalog, opened ArcToolbox > Data Management Tools > Raster > Raster Properties > Build Raster Attribute Table. Input a raster file, e.g., ASTGTM2_N37E140_dem.tif; accepted the defaults for now. Same for each raster file in each set, i.e., N37 then N38 (SendDEM).
- Created a new dataset for each raster in the list: in ArcCatalog > New > Raster Dataset, typed the label FukuDEM. Here we used pixel type 16_BIT_UNSIGNED for each and set the same projected coordinate system. Did this for each raster file.
- In the listing for each raster dataset in the catalog, used Load > Load Data and input two raster files into each listing. (How many depends on the ability of the server; here two is a safe number.) Did the same again for SendDEM. Lastly, clicked on the raster dataset in the listing > Calculate Statistics. Don't rush the computer. Opened the file; wow, a pic. Whew.
- 6) Set up the rest of the catalog list for the remaining database files: Geography (rivers, lakes). Added JapanAreas to Damage Assessment.
- Screenshot for deliverable.
- Part II: Fukushima Radiation Exposure Zones
- 1) Opened the map; saw the need for data repair. Located all shapefiles to originate out of the new Tsunami.gdb. Set the environments with ArcToolbox: set the workspace as the folder TsunamiGdb and the scratch workspace the same. Set the output coordinate system as WGS_1984_UTM_54N.prj.
- 2) Set processing extent as JpnBndOutline_UTM
- 3) Renamed the map kfm_Tsunami.mxd to preserve the original map's integrity. Added layers: JpnBndOutline, NuclearPwrPlnt, NEJapanPrefectures (JapanCities already on the map).
- 4) Reviewed the attribute tables of the layers; changed the appearance of the Japan boundary to null with a 1.0 outline. Changed the labels on NEJpnPrefectures to the Name_1 category.
- 5) Created separate layer for Fukushima PPt as only attribute.
- 6) Created a multi-ring buffer, using ArcToolbox > Analysis Tools > Proximity > Multiple Ring Buffer (script).
- Had major display issues with the Evac Zone multi-ring buffer. Could not see it on the map, even though it was showing up in the TOC. Viewing the map at 1:750,000, I should have been able to see the buffer zone; at 1:100,000, no Evac Zone rings on the map.
- Viewed the discussion issues. Started over with a new map. Before doing anything, set the environments with ArcToolbox: workspace S:\...Tsunami.gdb (geodatabase), scratch workspace the same. Set the output coordinates as specified: our projected WGS 1984 UTM 54N; set the processing extent as JpnBndOutline_UTM. Added data: a) fukushima_ppt; b) the fuku evac zone (multi-ring buffer). Then zoomed to the buffer, scale 1:475. The ring was front and center on the page; now I could not see anything else in the background. Used fixed zoom out gently to 1:100,000; ring still there. OK. Added the new data back in.
- 7) Alright, had to get my airport layer fixed. It was in a little pile at the far south end of my map at full extent. Went so far as to set the symbol at 50 to find it on my map. Reading the discussions, tried Andrea Hinz's suggestion: in the original shapefile in the basedata file, cleared all coordinate information and reset it with the geographic coordinate system WGS 1984. Next, reimported it into Tsunami.gdb/Transportation as Airfields3. Worked like a charm.
- 8) After the airfield fun, I realized I had entered the fields wrong on the Evac Zones. Back to the drawing board on it. Now the base file is named Fukshima_ppt_MB2. (clip.
- 9) So Frustrated. Stopped.
- 10) It was too late in the night for me to rejoin the discussion board after 11 pm on 6/12. Before I rejoined, the discussion board reflected another student having the same select-by-location issues, with his screenshot dated 6/12 (jpeg3) showing exactly my issues. The instructor has graciously stepped out to "update base files" from the R: drive, since many students were having a variety of data issues. Taking these issues into account, I've chosen just to start the whole exercise again from scratch.
- a) Downloaded the revised data from R: to S:. Named the new workspace folder Tsunami_redo and the geodatabase workspace file REDO_Tsunami.gdb, following the steps creating the database files again. Will not bother to describe the same steps as above unless there are any issues to report.
- b) Created a second data frame at the suggestion of a fellow student to work with Select By Location. The three base feature classes are 1) FusushimaNUK, 2) CityPopData (from .xls), and 3) Fuku_Evac_ZoneBuff. Could not see the city dots; at full extent, found them in a pile off the map. Definitely a coordinate system issue. Back into ArcCatalog for a fix: deleted CityPopData out of REDO_Tsunami.gdb, went back to citypopdata.shp in TsunamiGdb, cleared the coordinate system out of the file, and re-entered it as the geographic coordinate system GCS WGS 1984. Then exported the file into REDO_Tsunami.gdb as a feature class with the present coordinate system WGS 1984 UTM 54N. Previewed the map, and there they are; the dots are just fine.
- 11) Looks like I'm back on track. With just the 3 layers in 10-b above, Select By Location worked; whew, only the cities inside the evac zones are blue. Alright!! High five to me and the classmates figuring it out.
- 12) Pressing through the balance of the map design. Worked on adjusting the visual effects of the DEM on the map in relation to the evacuation zones; a little challenging with the additional layers. Otherwise, the map was eventually produced.
- 13) Was pleasantly surprised how efficiently the last map came together; actually redid the entire lab, both parts, with fewer issues (had a coordinate issue, quickly resolved) within 5 hours. Reflects improved confidence in ability and increased comprehension of the tasks.
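The multi-ring buffer and Select By Location steps above are, at heart, a distance-band classification: which ring around the plant does each city fall in? A plain-Python sketch of that idea follows; the plant location, ring radii, and city coordinates are invented stand-ins, not the lab's actual data, and this is not the ArcGIS tool itself.

```python
import math

def ring_for(point, center, ring_distances):
    """Return the smallest buffer ring distance containing the point, or None
    if the point falls outside every ring (i.e., outside the evac zones)."""
    d = math.dist(point, center)  # straight-line distance, Python 3.8+
    for r in sorted(ring_distances):
        if d <= r:
            return r
    return None

plant = (0.0, 0.0)        # stand-in for the power plant location
rings = [10, 20, 30]      # hypothetical evacuation-zone radii
cities = {"A": (5.0, 5.0), "B": (0.0, 25.0), "C": (40.0, 0.0)}

for name, loc in cities.items():
    print(name, "->", ring_for(loc, plant, rings))
```

City A lands in the innermost ring, B in the outermost, and C in none, which is exactly the "only the cities inside the evac zones are blue" result from step 11.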
Saturday, June 9, 2012
Here's an easy Button
The scenario is to create an easier method for the employees to do, and do, and do one of their most favorite tasks: "calculate area." The solution: a Python script was made into a tool on the desktop-accessible toolbar. One can download this into their personal toolbar for use at any time. Here's what it looks like. With your microscope you can see the little blue scroll-appearing tool at the bottom of my toolbar, sitting all by itself.
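The actual script behind the button isn't shown here, but the core "calculate area" idea can be sketched with the shoelace formula, which computes a polygon's area from its vertex coordinates. The coordinates below are invented; in the real tool, ArcGIS would supply the geometry.

```python
def polygon_area(coords):
    """Area of a simple (non-self-intersecting) polygon via the shoelace formula."""
    n = len(coords)
    area = 0.0
    for i in range(n):
        x1, y1 = coords[i]
        x2, y2 = coords[(i + 1) % n]  # wrap back to the first vertex at the end
        area += x1 * y2 - x2 * y1
    return abs(area) / 2.0

# A 4 x 3 rectangle: area should be 12.
print(polygon_area([(0, 0), (4, 0), (4, 3), (0, 3)]))  # 12.0
```

Wrapping a small function like this into a toolbar tool is exactly what makes the task a one-click "easy button" instead of a repeated chore.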
Research and development will continue to be underway for additional time saving measures.
(Source: gispro:Mod2@asgn2d).