Monday, September 28, 2015

A Musing on Technology and Teaching


This will be a brief entry, as CCE is visiting with us for the next three days. But I was thinking about technology, and how it applies in the classroom and clinical settings, for educational purposes. It is clear that today’s students learn in ways somewhat alien to those of us who are older, who learned via the traditional teaching method of lecture and lab. I grew up in that world, and am comfortable with taking notes, reading books and studying hard. Our students are not.
This was made clear to me in a couple of ways. The first came last Thursday morning: my early students (there before class was to begin) were, without exception, sitting in class looking at either their tablet or their smartphone. They were not talking to each other; rather, they were reading something, either a text message or a webpage of some sort. Most of our students now come to class armed with such technology, and as a result they are less likely to interact in class. This is now how they communicate. Think about it: have you ever received a text message from someone in your office suite who could easily have walked 20 feet to your office? It happens to me daily. Text messages are the way people communicate now, and I find them impersonal and a bit passive-aggressive, to be honest.

The second thing is that we no longer need to remember information. Google has become our collective memory; whatever piece of information you need can be found there. Of course, once we find a piece of information, we need to verify that it is true. If nothing else, the information posted on Facebook is often twisted and wrong, yet it is posted by people who believe it is true because it fits some preconceived belief, political or scientific, they already hold. Thank goodness for websites such as Snopes.com, which often deconstruct or expose such falsehoods.

But that is why critical thinking is so important. That is the skill we need to focus on, training our students to think critically about the information they find. I think we do this fairly well, and we do so by yoking technology, such as Brightspace, to our abilities as teachers and clinicians.

Monday, September 21, 2015

Brightspace: First News Item


Best Practice: First News Item

The folks at Brightspace were kind enough to provide us with a Best Practices users’ guide, from which the material below appears on page 17 (The Basics: D2L Brightspace Mechanics and Best Practices. Normandale Community College, 2015:17).

First News Item

The first thing new students will see is your Course Homepage and the first News item. Among other things, that first News item should direct students specifically to a “Read Me First” file in the Content area. Providing a link to it in the first News item is also good: it avoids phrasing such as “Go to the Materials tab in the upper banner, click on Content, and then click on ‘Read Me First’.” That phrasing works, but a link is a quicker way to provide definite, easy directions. The first News item should contain, AT A MINIMUM:
 
·        Welcome
·        Instructor Name, Course Name, Formal Course Number
·        Instructor Contact Methods and Hours (phone, email, and office location. State when you will/will not be available).
·        Instructions to “Start Here” or “Read me First”
·        Online office hours (provided via group chat) one hour/week/three credit course.
·        Show some personality! Enter your credentials, a photo, teaching philosophy, etc.
·        Classroom-delivered courses should include a notice that the syllabus is posted, along with directions as to how you intend to use the course.

 Ongoing News Items

·        Current / newest News item goes on top, keep old ones below for reference.
·        Always repeat contact info and office hours in the current News item (copy and paste), unless you have an Instructor Contact widget on your homepage.
·        Keep News items short, bulleted, include a graphic, a link, etc. Students won’t read long items.
·        Try to change News items each week. More frequently is even better! Use it to have a conversation with your students like you would in class.
·        If you use the News area to direct students to assignments for the week, do it for each week of the course, to be consistent. Students will grow to depend on it.
·        Make them relevant – insert links to current “real” news items pertaining to your course, or a video.
·        Some people use the News area to display content; links that direct students to the content from the News are an often-used option. In the Content area, you may create lesson modules that have links to all your content.

 

Monday, September 14, 2015

How to Set Up Your Grade Book in Brightspace

This is reprinted from Brightspace documentation, located at https://community.brightspace.com/resources/documentation/how_to/howtosetupyourgradebook

 
Setting up the grade book is one of the first things you should do when preparing your course.

First Steps
·         Consult your course syllabus for a list of all assessment items

·         Decide on a grading system

·         Determine if your institution uses a grading scheme, and if you will need to recreate it for your grade book

·         Decide how you want to group grade items, and what weight those groups should have

·         Decide how you want to calculate your final grades

The Grades Setup Wizard

Use the Grades Setup Wizard to set up your grade book for the course. When you access the Grades tool, the Grades Setup Wizard displays by default until you set up your grade book. The wizard takes you through a seven-step process:
1.       Select a Grading System. You can select Weighted, Points, or Formula. For more information on selecting a grading system, see the one-pager “How to select a grading system for your grade book.”

2.       Determine which type of final grade calculation you want to release to students. You can select either Calculated Final Grade or Adjusted Final Grade.

3.       Determine how you want to treat ungraded items. You can select either Drop ungraded items or Treat ungraded items as 0. You can also keep users’ final grades up to date automatically by selecting Automatically keep final grade updated.

4.       Choose a default grading scheme. Organization Schemes are provided by your institution, or you can create a custom Course Scheme. See “Creating grade schemes” for more information.

5.       Control how many decimal places are displayed for items in the grade book.

6.       Control what your students see when they look at their grade books, including Grade Details, Decimals Displayed, Characters Displayed, and Final Grade Calculation.

7.       Review and finalize your grade book setup decisions. Click Finish to finalize your decisions or Go Back to make changes.

The link at the top of this entry will bring you to the page where this documentation appears. Additional information available there may give you further guidance, and you will also find links to the “Creating grade schemes” page.
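To make the grading-system choice in step 1 concrete, here is a minimal sketch of the arithmetic behind a “Weighted” system: each category carries a weight, and a student’s percentage within a category contributes that weight times the percentage to the final grade. The category names, weights, and scores below are hypothetical; this illustrates the general calculation, not Brightspace’s actual code.

```python
def weighted_final_grade(categories):
    """categories: list of (weight_percent, points_earned, points_possible).

    Returns the final grade out of 100, assuming the weights sum to 100.
    """
    total = 0.0
    for weight, earned, possible in categories:
        total += weight * (earned / possible)
    return total

# Hypothetical course with three weighted categories:
grades = [
    (40, 170, 200),  # exams: 40% weight, 170/200 points earned
    (35, 90, 100),   # assignments: 35% weight, 90/100 points
    (25, 45, 50),    # participation: 25% weight, 45/50 points
]
print(round(weighted_final_grade(grades), 1))  # prints 88.0
```

A “Points” system, by contrast, simply sums points earned over points possible across all items, so heavily weighted small assignments behave very differently between the two schemes; that is why the wizard asks you to decide up front.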

 

Tuesday, September 8, 2015

New Papers from Biomed Central

Khoiriyah U, Roberts C, Jorm C, Van der Vleuten CPM. Enhancing students’ learning in problem based learning: validation of a self-assessment scale for active learning and critical thinking. BMC Med Educ 2015, 15:140  doi:10.1186/s12909-015-0422-2

ABSTRACT
Background: Problem based learning (PBL) is a powerful learning activity but fidelity to intended models may slip and student engagement wane, negatively impacting learning processes and outcomes. One potential solution to this degradation is encouraging self-assessment in the PBL tutorial. Self-assessment is a central component of the self-regulation of student learning behaviours. There are few measures to investigate self-assessment relevant to PBL processes. We developed a Self-assessment Scale on Active Learning and Critical Thinking (SSACT) to address this gap. We wished to demonstrate evidence of its validity in the context of PBL by exploring its internal structure.

Methods: We used a mixed methods approach to scale development. We developed scale items from a qualitative investigation, literature review, and consideration of previous existing tools used for study of the PBL process. Expert review panels evaluated its content; a process of validation subsequently reduced the pool of items. We used structural equation modelling to undertake a confirmatory factor analysis (CFA) of the SSACT and coefficient alpha.
Results: The 14 item SSACT consisted of two domains “active learning” and “critical thinking.” The factorial validity of SSACT was evidenced by all items loading significantly on their expected factors, a good model fit for the data, and good stability across two independent samples. Each subscale had good internal reliability (>0.8) and strongly correlated with each other.

Conclusions: The SSACT has sufficient evidence of its validity to support its use in the PBL process to encourage students to self-assess. The implementation of the SSACT may assist students to improve the quality of their learning in achieving PBL goals such as critical thinking and self-directed learning.


Tzeng DS, Wu YC, Hsu JY. Latent variable modeling and its implications for institutional review board review: variables that delay the reviewing process. BMC Med Ethics 2015, 16:57  doi:10.1186/s12910-015-0050-8

ABSTRACT
Background: To investigate the factors related to approval after review by an Institutional Review Board (IRB), a structural equation model was used to analyze the latent variables ‘investigators’, ‘vulnerability’ and ‘review process’ for 221 proposals submitted to our IRB.

Methods: The vulnerability factor included vulnerable cases, and studies that involved drug tests and genetic analyses. The principal investigator (PI) factor included the license level of the PI and whether they belonged to our institution. The review factor included administration time, total review time, and revision frequency. The revision frequency and total review time influenced the efficiency of review.
Results: The latent variable of reviewing was the most important factor mediating the PIs and vulnerability to IRB review approval. The local PIs moderated with genetic study and revision frequency had an impact on the review process and mediated non-approval.

Conclusions: Better guidance of the investigators and reviewers might improve the efficiency with which IRBs function.

Chapman PD, Stomski NJ, Losco B, Walker BF. The simulated early learning of cervical spine manipulation technique utilising mannequins. Chiropr Man Ther 2015, 23:23  doi:10.1186/s12998-015-0067-6
ABSTRACT

Background: Trivial pain or minor soreness commonly follows neck manipulation and has been estimated at one in three treatments. In addition, rare catastrophic events can occur. Some of these incidents have been ascribed to poor technique where the neck is rotated too far. The aims of this study were to design an instrument to measure competency of neck manipulation in beginning students when using a simulation mannequin, and then examine the suitability of using a simulation mannequin to teach the early psychomotor skills for neck chiropractic manipulative therapy.
Methods: We developed an initial set of questionnaire items and then used an expert panel to assess an instrument for neck manipulation competency among chiropractic students. The study sample comprised all 41 fourth year 2014 chiropractic students at Murdoch University. Students were randomly allocated into either a usual learning or mannequin group. All participants crossed over to undertake the alternative learning method after four weeks. A chi-square test was used to examine differences between groups in the proportion of students achieving an overall pass mark at baseline, four weeks, and eight weeks.

Results: This study was conducted between January and March 2014. We successfully developed an instrument of measurement to assess neck manipulation competency in chiropractic students. We then randomised 41 participants to first undertake either “usual learning” (n = 19) or “mannequin learning” (n = 22) for early neck manipulation training. There were no significant differences between groups in the overall pass rate at baseline (χ² = 0.10, p = 0.75), four weeks (χ² = 0.40, p = 0.53), and eight weeks (χ² = 0.07, p = 0.79).
Conclusions: This study demonstrates that the use of a mannequin does not affect the manipulation competency grades of early learning students at short term follow up. Our findings have potentially important safety implications as the results indicate that students could initially gain competence in neck manipulation by using mannequins before proceeding to perform neck manipulation on each other.
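For readers unfamiliar with the statistics above, the χ² values compare the proportion of students passing in each group. A pure-Python sketch of the Pearson chi-square statistic (without continuity correction) follows; the pass/fail counts are hypothetical, since the abstract reports only the group sizes (19 and 22) and the test statistics.

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 table of counts,
    e.g. [[pass_a, fail_a], [pass_b, fail_b]], without continuity correction."""
    total = sum(sum(row) for row in table)
    row_sums = [sum(row) for row in table]
    col_sums = [sum(col) for col in zip(*table)]
    chi2 = 0.0
    for i in range(2):
        for j in range(2):
            # Expected count under the null hypothesis of equal pass rates
            expected = row_sums[i] * col_sums[j] / total
            chi2 += (table[i][j] - expected) ** 2 / expected
    return chi2

# Hypothetical counts: 15/19 pass in one group, 17/22 in the other.
observed = [[15, 4], [17, 5]]
print(round(chi_square_2x2(observed), 3))
```

A statistic this close to zero corresponds to a large p-value, i.e. no evidence of a difference between the two teaching methods, which is the pattern the study reports at all three time points.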