Math Hardware versus Software – Similarities & Differences with Casio

Using technology as part of learning math matters because of the extensions of learning, the visual connections, and the explorations it makes possible. The most common technologies students use these days are phones, tablets, computers, and of course hand-held devices such as calculators. What actually gets used depends on where you live, what school you attend, what is or isn’t allowed, and what resources are available to and understood by both teachers and students. From my own research, some schools and teachers have a multitude of resources, but most have limited options. And even when many technology tools are available, teachers tend to use the tool(s) they are most comfortable with and that the majority of students have access to. Basically, it comes down to choosing a technology that supports the learning and that students and teachers can use relatively efficiently, so that time is not lost to ‘tool logistics’. Oftentimes, again based on my own research (dissertation), teachers choose a tool that may NOT be the best choice for learning simply because they know how to use it, passing over a better, more appropriate tool they are unfamiliar or uncomfortable with – so better technology tools often go unused because of the ‘learning curve’.

What I want to use this post for today is to show how Casio has recognized the ‘learning curve’ issue and kept functionality consistent across handheld models and even in its software, providing intuitive steps and menu options right within the graphing menu itself that ease the ‘learning new tool functionality’ concerns teachers and students often face when using technology. Our graphing calculators use essentially the same steps, buttons, and layout, from the basic models (fx-9750, fx-9860) to the more advanced ones (fx-CG50), so if you know one, you know them all. Even the new software, ClassPad.net, is built along the same lines, though obviously with more features and capabilities. There is no ‘searching for menus’ – it is relatively intuitive no matter the tool. Naturally, as you move into the newer models and then into the software, the functionality and options increase: we go from black-and-white displays to color, and from intersection points on the graphing calculators to unions and intersections of regions in the software. But knowing how to use one tool makes transitioning easy, and if you had students with several different models of the handhelds, you could still be talking about the same steps and keystrokes.

The best way to compare and demo is to show you how to do the same thing on the different models. I’ve chosen graphing two inequalities, so that you can see that shading and intersections are available even on the older models, and also how the newer, more powerful tools (more memory, color, larger and higher-resolution screens, etc.) allow for more options and learning extensions.

Here are the two inequalities being graphed in each of these short GIFs:

Each GIF below graphs the two inequalities and finds the intersection points of the two graphs. The software extends that by finding the Union and the Intersection of the two solution regions.
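The specific pair of inequalities appears only in the embedded GIFs, so as a rough stand-in, here is a minimal Python/matplotlib sketch of the same idea using two made-up linear inequalities: shade each region, shade their overlap (the region the software can display as an intersection), and mark where the boundary lines cross.

```python
# A minimal sketch (not the calculator's actual output): shade two
# hypothetical linear inequalities and mark where their boundaries cross.
# The inequalities below are placeholders; the pair used in the GIFs is
# not reproduced in the text of the post.
import numpy as np
import matplotlib.pyplot as plt

x = np.linspace(-5, 5, 400)

# Hypothetical system:  y <= -x + 4   and   y >= x - 1
upper = -x + 4
lower = x - 1

fig, ax = plt.subplots()
ax.plot(x, upper, label="y = -x + 4")
ax.plot(x, lower, label="y = x - 1")

# Shade each solution region, then the overlap of the two regions.
ax.fill_between(x, upper, -10, alpha=0.2)                      # y <= -x + 4
ax.fill_between(x, lower, 10, alpha=0.2)                       # y >= x - 1
ax.fill_between(x, lower, upper, where=upper >= lower, alpha=0.4)

# Intersection point of the two boundary lines: -x + 4 = x - 1
xi = 5 / 2
yi = -xi + 4
ax.plot(xi, yi, "ko")
ax.annotate(f"({xi}, {yi})", (xi, yi),
            textcoords="offset points", xytext=(6, 6))

ax.set_ylim(-10, 10)
ax.legend()
plt.show()
```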

Be sure to check out the free software that does calculating, graphing, statistics and geometry: ClassPad.net.

New Year’s Resolutions – A Chance to Explore Some Statistics

As I was at the gym this morning, noticing the increase in the number of people there, I got to thinking about New Year’s Resolutions. I personally dread January at the gym because, inevitably, it is a lot more crowded with all the ‘new memberships’ given as gifts over the holidays and full of new people who have decided that losing weight and getting in shape are on their to-do list for the new year. As someone who hits the gym regularly, I find this month a bit frustrating: machines are taken, the parking lot is crowded, and my regular routine is often interrupted by the influx. I admire everyone’s new-found commitment and applaud the goal of getting in shape and being healthier – however, my anecdotal evidence over the past several years is that this commitment is short-lived for many. By February, things tend to get back to normal because, sadly, many of the ‘New Year’s resolution’ folks lose their commitment and stop showing up, allowing the rest of us to get back to our routines.

Which brings me back to my thoughts about New Year’s Resolutions (NYR).

From my own very unscientific observations at the gym, those who made NYRs to get in shape, lose weight, etc. usually last about a month – based solely on the surge of people in January, the slow decrease as the month progresses, and the return to the regular crowd by February (with, granted, a few new ‘regulars’ who stick it out). I wondered, as I was cycling, whether there are any statistics out there that actually show the follow-through on New Year’s Resolutions – i.e. what resolutions were made at the beginning of the year, and what was the actual result at the end of the year?

I was able to find statistics on the most popular NYRs made last year (2018). However, I couldn’t find any follow-up statistics showing how many people in the survey actually stuck to their resolutions, which is what I think would be interesting to explore.

I then found another source that listed the 10 most popular NYRs made for this year (2019). Many of the same resolutions appear, though perhaps in a different order of priority, along with some new ones – differences that could reflect many things, i.e. the economy, the political climate, the source of the survey, who was surveyed, etc.

I am curious why those who conducted the surveys did no follow-up at the end of the year. It would be fascinating to see what the graphs look like at the end of the year compared to the beginning, and why some people dropped their NYRs while others stayed true. I couldn’t find any ‘proof’ for claims such as “80% of all NYRs fail by February”, though, going back to my personal observations, I would agree with it. There are definitely a lot of articles about how to ‘keep’ your resolutions, and plenty on why people don’t stick to them, but no statistics I could find that actually support the claim. It would be nice to have some data or evidence behind the observations – which leads me to my final thought: a fun ‘real-world’ statistical study that teachers might explore with their students for the remainder of this school year.

During this short week, when school has started up again but students tend to still be in vacation mode, why not start a long-term study to see if we can get some statistical data about NYRs? Have students in your class make a list of 3 NYRs – goals they really plan/want to accomplish by the end of the school year. Better yet, pick a specific month and/or date (May 30, for example). Then compile the class data to create categories and percentages, similar to the charts above. (My guess is students will have some different things on their top-10 list, which would be interesting in itself.) Have students keep a record of their progress toward their goals, and maybe do a quick monthly survey on students’ progress/commitment to their NYRs. Then, at the proposed deadline, do another survey on success/failure to see who is still working on their goals and who is not. Obviously it is going to be self-reported, but it would be interesting, as time goes on, to see who is staying committed, who is not, and more importantly, WHY they are not staying committed if that is the case. Do the class results verify that 80% drop off by February? Is there a common theme among those who do not follow through on their NYRs?
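If you want to crunch the class numbers quickly outside the calculator, here is a small Python sketch of that tabulation. All of the categories and responses below are made up for illustration; the idea is simply to turn the class survey into percentages and to compare a later check-in against the “80% fail by February” claim.

```python
# A quick sketch with made-up data: tally the class's resolution
# categories into percentages, then compare a later check-in against
# the "80% fail by February" claim.
from collections import Counter

# Hypothetical January responses: each student names one resolution category.
january = ["exercise", "grades", "save money", "exercise", "read more",
           "grades", "exercise", "less screen time", "grades", "save money"]

totals = Counter(january)
for category, count in totals.items():
    print(f"{category}: {count / len(january):.0%}")

# Hypothetical February check-in: True = still working on the resolution.
february = [True, False, False, True, False, False, True, False, False, False]
still_going = sum(february) / len(february)
print(f"Still committed in February: {still_going:.0%}")
print(f"Dropped off: {1 - still_going:.0%}")
```

Repeating the check-in each month gives the class a simple time series to graph and discuss alongside the “why” responses.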

I wanted to share this as an idea for teachers who might have made their own NYR to be more creative in their math class. The only NYR I ever made each year was to try at least one new thing in my math classes every month – for me, a pretty easy resolution to stick to. I would imagine many teachers do something similar. For those of you who have made NYRs, good luck and Happy New Year!