March 3, 2014
Usually when money and students are mentioned in headlines, the story is about student debt and the rising cost of tuition. Oracle has a more positive headline on the topic: “Metropolitan State University Reduces Students’ Nonpayments From US$4 Million To US$700,000 Per Year.”
Metropolitan State University is based in Minnesota. The university was losing $4 million a year to students not paying their tuition. The solution was to deploy Oracle RightNow to improve communication channels with students and to establish a student-relationship system that keeps track of conversations.
Deploying Oracle RightNow resolved the immediate problems. It provided a centralized system that sent quick, individualized responses, improved efficiency, streamlined application tracking, and, most importantly, sent out trigger messages to students, reducing nonpayments from $4 million to $700,000 a year.
Metro State selected Oracle because:
“ ‘We chose Oracle RightNow for its extensive reporting and analytics capabilities, which are far better than any other higher education customer relationship management tool on the market. Having the ability to easily put rich data in the hands of our advisors has really propelled us to the next level,’ said Andrew Melendres, vice president, student affairs and enrollment management, Metropolitan State University.”
Universities are slashing budgets left and right. Recovering several million dollars in previously unpaid tuition boosted Metro State’s budget and made the school an example for others. We would expect Harvard, Yale, and Stanford to follow Metro State’s lead.
February 18, 2014
Yay, free books! We love pointing out free resources, and now two textbooks relevant to content processing are available (in embedded PDF form) without charge via Yumpu.com. One, Information Technology for Management, from Turban, McLean, and Wetherbe, aims to bridge a crucial knowledge gap that plagues many businesses. It begins its task of educating managers in the mysteries of IT with a chapter titled, “Strategic Use of Information Technology in the Digital Economy.” As one might expect from a textbook, each chapter includes discussion questions and exercises. The authors illustrate their points with real-life examples from recognizable companies.
Another book, Data Mining Methods and Models, by Daniel T. Larose of Central Connecticut State University, tackles the building of data models. Like the book above, it provides questions and exercises for your enjoyment. This is the second book in a series. Its preface specifies that this volume:
“…explores the process of data mining from the point of view of model building: the development of complex and powerful predictive models that can deliver actionable results for a wide range of business and research problems…. [It provides]
*Models and techniques to uncover hidden nuggets of information
*Insight into how the data mining algorithms really work
*Experience of actually performing data mining on large data sets.”
If either of these sounds like it could be of use to you (or to someone you’ve been trying to explain these things to), don’t miss out on these free resources.
Cynthia Murrell, February 18, 2014
February 11, 2014
The Probability and Statistics Cookbook from Matthias Vallentin is a free statistics text. Vallentin is a doctoral student in computer science at UC Berkeley, where he works with Vern Paxson and has served as a teaching assistant for an undergraduate computer security course. He also works for the International Computer Science Institute. His research in network intrusion and network forensics began during his undergraduate studies in Germany. The “cookbook” is explained in the article:
“The cookbook aims to be language agnostic and factors out its textual elements into a separate dictionary. It is thus possible to translate the entire cookbook without needing to change the core LaTeX source and simply providing a new dictionary file. Please see the github repository for details. The current translation setup is heavily geared to Roman languages, as this was the easiest way to begin with. Feel free to make the necessary changes to the LaTeX source to relax this constraint.”
The overview provides screenshots that make it clear the cookbook focuses on the mathematical essentials rather than elaborate explanations. The author is open to pull requests to extend the cookbook; in the meantime, the LaTeX source code can be found on GitHub.
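Based on the quoted description, the translation mechanism presumably keeps every display string behind a macro defined in a standalone dictionary file, so a translator swaps one file rather than editing the core source. A minimal sketch (the file and macro names here are our own illustration, not taken from the repository):

```latex
% dictionary-en.tex (hypothetical file name)
\newcommand{\strMean}{Mean}
\newcommand{\strVariance}{Variance}

% A German dictionary-de.tex would redefine the same macros:
%   \newcommand{\strMean}{Erwartungswert}
%   \newcommand{\strVariance}{Varianz}

% The core source references only the macros:
% main.tex
\input{dictionary-en} % swap this line to change language
\section*{\strMean{} and \strVariance}
\[ \mathrm{E}[X] = \sum_x x\,P(X = x)
   \qquad
   \mathrm{Var}[X] = \mathrm{E}\big[(X - \mathrm{E}[X])^2\big] \]
```

Because the mathematics itself is language-neutral, only the dictionary file ever needs translating, which is presumably what the author means by “language agnostic.”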
Chelsea Kerwin, February 11, 2014
January 31, 2014
“Are Gifted Children Getting Lost In The Shuffle?” is the question David Lubinski asked when he conducted a study that tracked 320 gifted children from age 13 to age 38. News At Vanderbilt details how 203 of the children earned a master’s degree or higher, of whom 142 earned a doctorate. Most of the children had successful careers and went on to become doctors, attorneys, software engineers, Fortune 500 senior leaders, and even an advisor to the US President.
The article explains:
“Despite their remarkable success, researchers concluded that the profoundly gifted students had experienced roadblocks along the way that at times prevented them from achieving their full potential. Typical school settings were often unable to accommodate the rapid rate at which they learned and digested complex material. When students entered elementary and high school classrooms on day one having already mastered the course material, teachers often shifted focus away from them to those struggling with the coursework. This resulted in missed learning opportunities, frustration and underachievement, particularly for the exceptionally talented, the researchers suggest.”
The article continues that gifted children were left to their own devices because they had already mastered the course material, and that many schools, though they offer special education classes, lack programs for gifted children. Lubinski concluded that challenging these children would motivate them to push themselves further.
No argument with the study, except: did the researchers take environmental and socioeconomic factors into consideration? What other factors helped these children achieve success, and how exactly was a “roadblock” defined?
Whitney Grace, January 31, 2014
January 7, 2014
There is a troubling article over at Priceonomics titled, “Fraud in the Ivory Tower.” The post begins with the tale of former Tilburg University professor Diederik Stapel, who was found in 2012 to have fabricated or manipulated data in at least 30 papers that had been published in peer-reviewed journals. This case is a dramatic example of a growing problem; Fang Labs reports that instances of fraud or suspected fraud tripled between the 2002-2006 period and 2007-2011. Why the uptick?
We’re reminded that the famed “publish or perish” academic culture grows ever more demanding. At the same time, policies at scientific journals often mean that research integrity takes a back seat to provocative assertions.
“According to experimental psychologist Chris Chambers, high-impact journals (particularly in the field of psychology) look for results that are ‘exciting, eye-opening, even implausible.’ Novelty pieces. As psychologist Joseph Simmons told the science journal Nature: ‘When we review papers, we’re often making authors prove that their findings are novel or interesting. We’re not often making them prove that their findings are true.’”
Lovely. The write-up goes on to reveal that retractions are on the rise; the PubMed database contained only three publication retractions in 2000, but 180 in 2009. What’s more, these retractions are occurring most often at journals with high prestige (as measured by how often their papers are cited in other works).
The article states:
“Again, it is possible that this increase is caused by a stronger online watchdog culture. But regardless of whether the fraud is new or newly discovered, the case of Diederik Stapel reveals the ugly underbelly of scientific research. The pressure to publish frequently in prestigious journals has made it more likely for researchers to cut corners and manipulate data.”
The piece naturally concludes with a call for improvement. In doing so, the writer supplies this link to an article advocating open access to academic papers. Interesting.
Cynthia Murrell, January 07, 2014
December 2, 2013
Here is an interesting approach to academic freedom. The Chicago Tribune informs us that “Chicago State University Wants Faculty Blog Shut Down.” The blog in question, the Faculty Voice Blog, has dared to be critical of the University administration, so the school and its lawyers have sent an official “cease and desist” notice. Rather than engage the unhappy professors in civil debate, it seems the school has suddenly decided it has a problem with the blog’s use of its trademarks and trade names. (The blog has been active, and using these “trade names and marks,” since 2009.) The notice also characterizes the posts as unprofessional and uncivil, thereby violating University policy. No word on why they feel their policy trumps the First Amendment to the U.S. Constitution.
Reporter Juan Perez Jr. cites Phillip Beverly, the associate political science professor who founded the blog. The article relates:
“Roughly eight faculty members contributed to the site, Beverly said, under their own names or pseudonyms. The website used a picture of an on-campus Chicago State University sign and ‘CSU’ hedge sculpture. But Monday evening, after receiving the letter, Beverly changed the site’s name to ‘Crony State University’ and replaced its main image with a building from another campus.”
That’s one way to deal with specious charges (don’t worry, he is also consulting a lawyer). Beverly started the website specifically to provide a forum for discussing problems at the University, like disappointing graduation rates, poor money management, and inadequate leadership. For its part, the school seems to feel it has the right to dictate what information makes it into the public sphere.
“Last year, Chicago State officials instructed faculty and staff that only authorized university representatives could share information with the media—and that everything from opinion pieces to social media communications could require prior approval.”
This is not the first time Chicago State has run afoul of the First Amendment. Just last year, a federal judge decided the school had violated rights granted by that hallowed document when it fired another outspoken professor, Steven Moore. Perhaps University administrators should audit a few classes on constitutional law.
Cynthia Murrell, December 02, 2013
October 18, 2013
Science magazine has published an important article about today’s open-access academic journals, “Who’s Afraid of Peer Review?” I highly recommend reading the entire piece, but I’ll share some highlights here. Journalist John Bohannon begins:
“On 4 July, good news arrived in the inbox of Ocorrafoo Cobange, a biologist at the Wassee Institute of Medicine in Asmara. It was the official letter of acceptance for a paper he had submitted 2 months earlier to the Journal of Natural Pharmaceuticals, describing the anticancer properties of a chemical that Cobange had extracted from a lichen.
“In fact, it should have been promptly rejected. Any reviewer with more than a high-school knowledge of chemistry and the ability to understand a basic data plot should have spotted the paper’s shortcomings immediately. Its experiments are so hopelessly flawed that the results are meaningless. I know because I wrote the paper.”
You see, Science performed an elaborate sting operation across the rapidly growing field of open-access journal publication. Most of these journals make money by charging authors upon acceptance of their articles; Bohannon began to suspect a number of these publications were motivated to accept papers that would not stand up to rigorous peer review, despite assertions on their websites to the contrary. What he found is truly disheartening.
See the article for the methodology behind the fake paper and Bohannon’s submissions procedure, both of which are informative in themselves. When the article went to press, far more journals (157) had accepted the bogus paper than had rejected it (98). Even respected publishers like Elsevier and Sage were found to host at least one of these questionable journals. Most of the publishers that performed any review at all focused on mechanical issues like formatting, not substance. What is going on here? Bohannon offers:
“A striking picture emerges from the global distribution of open-access publishers, editors, and bank accounts. Most of the publishing operations cloak their true geographic location. They create journals with names like the American Journal of Medical and Dental Sciences or the European Journal of Chemistry to imitate—and in some cases, literally clone—those of Western academic publishers. But the locations revealed by IP addresses and bank invoices are continents away: Those two journals are published from Pakistan and Turkey, respectively, and both accepted the paper…
“About one-third of the journals targeted in this sting are based in India—overtly or as revealed by the location of editors and bank accounts—making it the world’s largest base for open-access publishing; and among the India-based journals in my sample, 64 accepted the fatally flawed papers and only 15 rejected it.”
So, opportunists in the developing world have seized upon faux-reviewed academic publishing as the way to turn a PC and an Internet connection into profits. Good for them, bad for science. How does one know when Bing or Google links to fake info? Does it matter anymore? I have to think it does. I hope that people in the field, like Bohannon, who care about open access to legitimate research will find a way to counter this flood of bad information. In the meantime, well… don’t believe every link you read.
Cynthia Murrell, October 18, 2013
October 14, 2013
Academic publishing juggernaut Elsevier produces a resource for journal editors called, reasonably enough, Editors’ Update. Retraction Watch calls our attention to the Update’s recent series on ethics in, “Editor: ‘Close to 10% of the Papers We Receive Show Some Sign of Academic Misconduct’.” While the Elsevier ethics series also addresses topics like bias and research misconduct, it is the prevalence of plagiarism that concerns Retraction Watch’s Ivan Oransky. He pulls this quote from a piece in the series by Henrik Rudolph, editor in chief of Applied Surface Science:
“Close to 10% of the papers we receive show some sign of academic misconduct, but since the total number of submissions is increasing, the absolute number is also rising. The most common issue we see is too large an overlap with previously published material, i.e. plagiarism. Cases are evenly divided between self-plagiarism and regular plagiarism. These submissions are most often identified in the editorial phase (by the managing editor or editor) and are rejected before they are sent out for review. iThenticate is an important instrument for detecting academic misconduct, but often common sense is an equally important instrument. . . . If it looks fishy it probably is fishy.”
Examples of fishy-looking content include a sudden shift from U.K. to U.S. English and spelling errors copied straight from the original. Oransky supplies descriptions of the articles to be found in part one of the Elsevier Ethics Special Edition, as well as a brief blurb on what to expect from part two. Check it out for more on this unsettling tendency.
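Rudolph does not describe how iThenticate works internally (its methods are proprietary), but the “overlap with previously published material” he mentions can be illustrated with a toy word n-gram comparison. The function names, the choice of n, and the sample texts below are our own illustration, not iThenticate’s actual method:

```python
def ngrams(text: str, n: int = 5) -> set:
    """Return the set of word n-grams appearing in a text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap_score(submission: str, published: str, n: int = 5) -> float:
    """Fraction of the submission's n-grams that also appear in the
    published text -- a crude proxy for textual overlap."""
    sub = ngrams(submission, n)
    if not sub:
        return 0.0
    return len(sub & ngrams(published, n)) / len(sub)

# A submission that copies a passage nearly verbatim scores high;
# genuinely new text shares almost no 5-grams with prior work.
prior = ("the catalytic activity of the complex was measured "
         "under standard conditions in all trials")
copied = ("the catalytic activity of the complex was measured "
          "under standard conditions in all trials here")
fresh = ("we report an entirely new synthesis route with different "
         "reagents and improved yields overall today")

print(overlap_score(copied, prior))  # high overlap
print(overlap_score(fresh, prior))   # no overlap
```

Real detection systems compare against enormous corpora and handle paraphrase, but even this sketch shows why verbatim reuse is the easiest misconduct to catch mechanically, while the “common sense” Rudolph mentions is still needed for everything else.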
Cynthia Murrell, October 14, 2013
September 27, 2013
For those of us who have been living under a rock (myself included), Khan Academy is the latest and greatest thing to come to mathematics education. According to a recent Learn and Teach Statistics blog post, however, “Khan Academy Statistics Videos Are Not Good.”
After viewing a sampling of the Khan Academy statistics videos, the author concludes:
“My main criticism is that the video is dull. It doesn’t provide anything more than the mathematics. But apart from alienating non-mathematical students it isn’t harmful. In fact if I had a student who wanted to know the mathematics behind the statistics, I would be happy to send them there. People have commented that my videos don’t tell you how the p-value is calculated. This is true. That is not the aim. Maybe I’ll do one about that one day, but I figured it was more important to know what to do with one.”
This is an excellent example of why pouring money into a project doesn’t automatically make it good or useful. While video teaching in itself is not a problem, videos that are not helpful are.
Jasmine Ashton, September 27, 2013
September 16, 2013
The academic community is supposed to represent integrity, research, and knowledge. When a project goes awry, researchers can understandably get upset, because several things could be on the line: job, funding, tenure, etc. To make the findings go the way they want, researchers may be tempted to falsify data. A recent post on Slashdot points to a questionable academic situation: “Request To Falsify Data Published In Chemistry Journal.” Is this a case where data was falsified? Read the original post:
“A note inadvertently left in the ‘supplemental information’ of a journal article appears to instruct a subordinate scientist to fabricate data. Quoting: ‘The first author of the article, “Synthesis, Structure, and Catalytic Studies of Palladium and Platinum Bis-Sulfoxide Complexes,” published online ahead of print in the American Chemical Society (ACS) journal Organometallics, is Emma E. Drinkel of the University of Zurich in Switzerland. The online version of the article includes a link to this supporting information file. The bottom of page 12 of the document contains this instruction: “Emma, please insert NMR data here! where are they? and for this compound, just make up an elemental analysis …” We are making no judgments here. We don’t know who wrote this, and some commenters have noted that “just make up” could be an awkward choice of words by a non-native speaker of English who intended to instruct his student to make up a sample and then conduct the elemental analysis. Other commenters aren’t buying it.’”
Does “make up an elemental analysis…” sound credible to you? Researchers are supposed to question and analyze every iota of data until there is nothing left to explore. Making something up only produces false data and will cause future studies to be inaccurate. Is this common practice among academics, or just an isolated incident?
Whitney Grace, September 16, 2013