Technology in the Classroom?

The Economist ran one of their occasional EdTech updates; two titles caught my eye:


  • Summary bit: Together, technology and teachers can revamp schools
    • copy of this article here
  • And Technology is transforming what happens when a child goes to school
    • copy of this article here

AI in Popular Press

Two popular recent articles on applications of machine learning:

  • Million-Dollar Prize Hints at How Machine Learning May Someday Spot Cancer (Will Knight, May 9, 2017)
    • a copy of this article is here
  • Machine-learning promises to shake up large swathes of finance (The Economist, May 28, 2017)
    • a copy of the article is here

Simpler is Better

Some 130,000 users have made some level of effort to examine a “language learning” approach to recognizing words (originally developed at Teachers College while looking at ways to introduce Indo-European language speakers to Sino-Tibetan and other non-alphabetic languages). Over the last five years, I have come to realize that the original implementation of this idea added an unnecessary layer of complexity and, for most people, probably severely limited (or even entirely hid) the appeal of the notion. Learning languages is sufficiently challenging for most beginners; there was no need to add more complexity to the basic approach, at least at the outset. No need to create puzzles that make it more interesting (or just harder) to click on words, hear them pronounced, and see the characters. Just let people click as often as they like.

With the new Touch mode, and native standard pronunciation.

So with version 5.x, this language learning tool is much more useful to my own study of basic “Hanzi” (Chinese characters): it starts in a simple “point-click-hear-see” mode that lets me click any word at random and hear it in Chinese (or German, French, etc.) as many times as I like.  I suspect most people who try this will find it useful, perhaps so useful that they never get around to the “review” or “play” modes (still available in the app) for testing their own retention.  Ah well, five years on, better late than never.  See http://AppStore.com/harrylayman.
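
For the curious, here is a minimal sketch of what that “point-click-hear-see” loop amounts to. It is not the app’s actual implementation: the vocabulary, the pinyin glosses, and the choice of the pyttsx3 text-to-speech library are all illustrative assumptions.

```python
# A toy "point-click-hear-see" loop with a console stand-in for touch.
# Vocabulary, glosses, and the TTS choice are illustrative assumptions only.
import pyttsx3

WORDS = {  # hypothetical starter vocabulary: word -> (hanzi, pinyin)
    "water": ("水", "shuǐ"),
    "fire": ("火", "huǒ"),
    "person": ("人", "rén"),
}

engine = pyttsx3.init()  # uses whatever TTS voices the OS provides

def touch(word: str) -> None:
    """Simulate tapping a word: show the character, hear it spoken."""
    hanzi, pinyin = WORDS[word]
    print(f"{word} -> {hanzi} ({pinyin})")
    engine.say(hanzi)  # pronunciation quality depends on installed voices
    engine.runAndWait()

# Tap any word, as often as you like; no puzzle, no test.
for w in ["water", "fire", "water", "person"]:
    touch(w)
```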


Assessment in the Classroom

I am seeing more articles about different kinds of assessment, from performance-based to multi-stage, and more about getting teachers up to speed on the basics of assessment.  I am not sure I understand or would prioritize this “new learning” for teachers for much of K12, or at least grades 7 to 12, where subject-matter expertise in areas like STEM, new offerings in STEM, and critical thinking and problem solving (not to mention working with algorithms and data) would get my vote for more teacher-focused action. But then, a basic grounding in the framework of “academic measurement”, without the full crush of statistics and psychometrics, but with the wisdom and practice that has evolved since at least the time of Alfred Binet, would be a valuable thing where it is missing… but don’t they teach that in “education schools”?

Cambridge Assessment’s blog has a piece on this thinking, about which I am still thinking:

http://www.cambridgeassessment.org.uk/blog/the-basics-of-assessment-for-new-teachers

Meanwhile, thinking of all things UK, I am slowly digesting the coming changes to A Levels: not so much the mostly-noise debate around how to communicate “scoring changes”, but rather how disappointing it was to see “critical thinking” on the list of discontinued A-Level exams.  I understand the advantage of fewer, better tests, but seeing room on the list going forward for “ancient languages”, “classical civilisation”, ancient history, government, geology, design and technology, electronics, film studies, and such, it seems a shame…  At least I was still able to buy “Thinking Skills” by John Butterworth and Geoff Thwaites (US Amazon link).  I will work hard to fit a careful read of it in before too many weeks pass.

Machine Learning for Text in the News (again): Finance

A short but interesting piece appeared in The Economist this week, entitled Machine-learning promises to shake up large swathes of finance, under the heading “Unshackled algorithms” (located here).

Many of the usual observations and platitudes are contained therein, but I thought these quotes were notable:

  • Natural-language processing, where AI-based systems are unleashed on text, is starting to have a big impact in document-heavy parts of finance. In June 2016 JPMorgan Chase deployed software that can sift through 12,000 commercial-loan contracts in seconds, compared with the 360,000 hours it used to take lawyers and loan officers to review the contracts. [So maybe once again I am focused on one of the least remunerative aspects of a new technology…]
  • Perhaps the newest frontier for machine-learning is in trading, where it is used both to crunch market data and to select and trade portfolios of securities. The quantitative-investment strategies division at Goldman Sachs uses language processing driven by machine-learning to go through thousands of analysts’ reports on companies. It compiles an aggregate “sentiment score” based on the balance of positive to negative words. [Seems a bit simplistic, no? See the toy sketch after this list.]

  • In other fields, however, machine-learning has game-changing potential. There is no reason to expect finance to be different. According to Jonathan Masci of Quantenstein, a machine-learning fund manager, years of work on rules-based approaches in computer vision—telling a computer how to recognise a nose, say—were swiftly eclipsed in 2012 by machine-learning processes that allowed computers to “learn” what a nose looked like from perusing millions of nasal pin-ups. Similarly, says Mr Masci, a machine-learning algorithm ought to beat conventional trading strategies based on rules set by humans. [The data point replicates, over the same timeframe, what Elijah Mayfield showed: with days of work, off-the-shelf, open-source machine learning could produce results (for scoring essays) competitive with the capabilities of decades-old rule-based systems (e-Rater, Intelligent Essay Assessor, and six others). See note below.]
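
The Goldman “sentiment score” above (a balance of positive to negative words) is essentially a lexicon count. As promised, a toy sketch follows; the word lists and the normalization are my own illustrative assumptions, with no claim to match Goldman’s actual method.

```python
# Toy lexicon-based "sentiment score": the balance of positive to negative
# words, normalized to [-1, 1]. Word lists here are illustrative assumptions.
import re

POSITIVE = {"beat", "growth", "upgrade", "strong", "outperform"}
NEGATIVE = {"miss", "decline", "downgrade", "weak", "underperform"}

def sentiment_score(text: str) -> float:
    """+1.0 if all matched words are positive, -1.0 if all are negative."""
    tokens = re.findall(r"[a-z]+", text.lower())
    pos = sum(t in POSITIVE for t in tokens)
    neg = sum(t in NEGATIVE for t in tokens)
    return 0.0 if pos + neg == 0 else (pos - neg) / (pos + neg)

report = "Analysts expect strong growth after the upgrade, despite one weak quarter"
print(sentiment_score(report))  # 0.5 -> net positive
```

Which is why the skepticism in my aside seems earned: a simple count like this is blind to negation and context (“not strong” scores the same as “strong”).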

I would also note that such “supervised learning” applications, which leverage NLP (natural-language processing) tools (tools that are used in, but are not by themselves good examples of, AI techniques), are now a standard “first stage” of machine learning, one that typically evolves toward some form of neural network-based improvement, just as the “computer vision” example noted above did in subsequent iterations over the last five-plus years.
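
For concreteness, a minimal sketch of that “first stage”: hand-built bag-of-words features feeding a linear classifier via scikit-learn. The four snippets and labels are made-up placeholders; real systems would train on far more data before graduating to neural representations.

```python
# A "first stage" supervised-learning text pipeline: bag-of-words features
# plus logistic regression. Later iterations typically replace these
# hand-built features with learned neural representations.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

docs = [  # made-up training snippets
    "earnings beat expectations with strong guidance",
    "record revenue growth this quarter",
    "profit warning and weak demand",
    "downgrade follows disappointing results",
]
labels = [1, 1, 0, 0]  # 1 = positive sentiment, 0 = negative

model = make_pipeline(CountVectorizer(), LogisticRegression())
model.fit(docs, labels)

print(model.predict(["strong results despite the downgrade chatter"]))
```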

Good stuff.

For the Elijah Mayfield reference, see:

  • Mayfield, E., & Rosé, C. P. (2013). LightSIDE: Open source machine learning for text accessible to non-experts. Invited chapter in the Handbook of Automated Essay Grading.
  • Shermis, M. D., & Hamner, B. (2012). Contrasting state-of-the-art automated scoring of essays: Analysis. Annual National Council on Measurement in Education meeting, March 29, 2012, pp. 1–54.

What’s goin’ on?

It happens that What’s Going On is the eleventh studio album by soul musician Marvin Gaye, released May 21, 1971.

Forty-six years later, now more than ever, I have to ask:  What’s Goin’ On?

[Album cover: Marvin Gaye, What’s Going On]

Data Science Bowl 2017 – more AI for medicine and medical images

I was interested to read the piece in the MIT Technology Review,

Million-Dollar Prize Hints at How Machine Learning May Someday Spot Cancer

A million-dollar prize certainly grabbed some headlines, but the details of the winning solution (more image annotations, i.e., more trained doctors and technicians, plus partitioning the basic problem into (a) finding nodules and (b) diagnosing cancer) are clear signposts to the future. Indeed, the future of low-dose CT scans is certainly looking stronger.  And while progress with machine learning, medical imaging, and diagnostic medicine is not always linear (or straightforward, as we read here), 3D images that capture relative tissue density and other characteristics clearly provide a highly construct-relevant feature set, one that is making advances in this area steady and promising (editorial: in a way that other work relying on indirect features and characteristics, computational linguistics asking “is this argument convincing?” in this case, is not yet keeping up with…).
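
That two-part partition is worth pausing on: it is just a two-stage pipeline, detect first, then classify. A schematic sketch follows; both stage functions are stubs standing in for the trained 3D convolutional networks actual competitors used, and the combination rule (max over candidates) is my own illustrative assumption.

```python
# Schematic two-stage pipeline in the shape of the winning approach:
# stage (a) proposes candidate nodules, stage (b) scores malignancy.
# Both stages are stubs standing in for trained 3D conv nets.
from dataclasses import dataclass
from typing import List

@dataclass
class Nodule:
    center: tuple        # (z, y, x) voxel coordinates in the CT volume
    diameter_mm: float

def find_nodules(ct_volume) -> List[Nodule]:
    """Stage (a): a detector proposes candidate nodules (stub)."""
    return [Nodule(center=(42, 180, 205), diameter_mm=6.5)]

def malignancy_score(ct_volume, nodule: Nodule) -> float:
    """Stage (b): a classifier scores one candidate (stub)."""
    return 0.12  # placeholder probability

def predict_cancer_probability(ct_volume) -> float:
    """Combine per-nodule scores; here, simply the max over candidates."""
    nodules = find_nodules(ct_volume)
    return max((malignancy_score(ct_volume, n) for n in nodules), default=0.0)

print(predict_cancer_probability(ct_volume=None))  # 0.12 with the stubs
```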

Since Google’s acquisition of Kaggle, I have not taken a new look at the Google tool set for creating deep learning networks, but introducing a “semantic data layer”, based on a semantic-grammar approach to rubric construction, might offer a promising path to better machine understanding of text and speech.