09-24-1997 Council Agenda
HEALTHY YOUTH / HEALTHY CITIES

Measuring Success

By Edward C. Siegel

On any given day, we are inundated with the difficulties facing our youth. Often, the common response is to get tough. But do hard-line tactics work and, if not, how can cities evaluate and improve programs?

Minnesota communities see the situation facing youth firsthand and are addressing it through a number of prevention strategies. Prevention strategies can be grouped into three broad categories: primary prevention, aimed at the general public; secondary prevention, aimed at individuals exhibiting high-risk characteristics; and tertiary prevention, aimed at those individuals who have already entered the criminal justice system. The biggest question for all of these emerging techniques is, "Do they work?" Evaluation can help answer this question.

Evaluation is as much a way of thinking about how strategies are implemented as it is a set of handy tools and techniques for collecting information. An evaluator always asks, "How do I know what I am saying about my program is true and will be accepted by others as true?" In setting up an evaluation of youth programs, activities, and events, the goal is not to prove that all youth have been saved from engaging in all possible negative activities. Instead, the goal is to develop a chain of evidence that a reasonable person would accept as demonstrating success.

For example, research may demonstrate that teaching independent living skills to homeless youth reduces their likelihood of committing crimes or becoming victims of crime. To effectively evaluate such a program, the youth should be able to demonstrate they have learned living skills. The evaluation does not have to demonstrate that youth who have completed the independent living skills program are less involved in crime. Previous research has already established that connection. Thus, a demonstration of effective teaching and learning indicates that the youth participating in the program are also less vulnerable to crime.

[Sidebar: Cities have varying degrees of responsibility and involvement with issues and concerns affecting children and families. A 1996 National League of Cities survey asked respondents to report the three areas their city was most connected to during the previous year.]

As cities become more involved in youth programming, as primary sponsors and collaborators, conducting a thoughtful evaluation will both measure success and determine areas needing improvement. There are several simple guidelines to consider when evaluating youth programs:

Determine basic demographics. It will be important to figure out whether the program is reaching the group of originally targeted youth. Funders especially want to know who receives services.

Get feedback. What did the participating youth enjoy about the program? What could have been done differently or better? What aspects made it difficult to participate; e.g., lack of transportation, meeting times, lack of child care? Was staff well prepared? Were other participants eager and supportive? Finding out the useful and not-so-useful components of a program, event, or activity is vital information. Good feedback data provides the clues needed for improvement.

Outcome data is crucial. Although outcome data may be far more difficult to obtain than other types of data, it is by far the most important. Reliable outcome data tells what was directly accomplished as a result of participating in the program, activity, or event. Some examples of outcome data include test results, records of goal accomplishments, and school records.

Be objective. Always collect information, feedback, and data in an objective format. This means questions should have well-defined choices for answers, tests should be multiple choice, and opinion items should have defined scales.

Limit open-ended comments. Written comments are interesting, but for the most part they should be supplemental to the evaluative data rather than the main body of data. Open-ended comments are obviously more subjective and difficult to measure.

Keep materials brief. Evaluation materials should be brief and still cover the topic success-