
A tricky exception when running MapReduce functions through RHadoop: its root cause and how to fix it.

RHadoop (https://github.com/RevolutionAnalytics/RHadoop/wiki) is a collection of five R packages (rhdfs, rmr2, rhbase, ravro, plyrmr) that allow users to manage and analyze data with Hadoop. When running any MapReduce function through RHadoop on a Linux server, even one as simple as

    from.dfs(mapreduce(to.dfs(1:100))) 

you could face this exception:

2015-10-20 08:39:41,722 ERROR [main] org.apache.hadoop.streaming.PipeMapRed: configuration exception
java.io.IOException: Cannot run program "Rscript": error=2, No such file or directory
    at java.lang.ProcessBuilder.start(ProcessBuilder.java:1059)
    at org.apache.hadoop.streaming.PipeMapRed.configure(PipeMapRed.java:209)
    at org.apache.hadoop.streaming.PipeMapper.configure(PipeMapper.java:66)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:95)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:56)
    at java.lang.reflect.Method.invoke(Method.java:620)
    at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:109)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:78)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
    at org.apache.hadoop.mapred.MapRunner.configure(MapRunner.java:38)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:95)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:56)
    at java.lang.reflect.Method.invoke(Method.java:620)
    at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:109)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:78)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:449)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
    at java.security.AccessController.doPrivileged(AccessController.java:452)
    at javax.security.auth.Subject.doAs(Subject.java:572)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: java.io.IOException: error=2, No such file or directory
    at java.lang.UNIXProcess.<init>(UNIXProcess.java:188)
    at java.lang.ProcessImpl.start(ProcessImpl.java:164)
    at java.lang.ProcessBuilder.start(ProcessBuilder.java:1040)
    ... 24 more
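
For context, here is a minimal sketch of how such a job is typically launched from an R session; the HADOOP_CMD and HADOOP_STREAMING paths below are illustrative placeholders that depend on your Hadoop installation, not values taken from the cluster described here:

    # Environment variables used by rhdfs and rmr2; the paths are examples
    # and must be adapted to your Hadoop distribution
    Sys.setenv(HADOOP_CMD = "/usr/bin/hadoop")
    Sys.setenv(HADOOP_STREAMING = "/usr/lib/hadoop-mapreduce/hadoop-streaming.jar")

    library(rhdfs)
    hdfs.init()
    library(rmr2)

    # The test job from above: write 1:100 to HDFS, run an identity
    # MapReduce job over it and read the result back
    from.dfs(mapreduce(to.dfs(1:100)))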

At first glance this looks like a permission issue on the Rscript executable for the user running the Hadoop job (the file itself cannot be missing, since it is part of the R environment and the MapReduce function was triggered from an R console). However, the exception occurs even when that user has r-x permissions on the Rscript file. The real root cause is that RHadoop/Hadoop tries to execute the Rscript tool from /usr/bin/, while it actually lives in the $R_HOME/bin/ directory. The solution is simple: as the root user, create a symbolic link to the Rscript file this way:

    ln -s $R_HOME/bin/Rscript /usr/bin/Rscript

This solution works on Red Hat and CentOS, and I suppose it should work on any other Linux distribution as well.
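
After creating the link, a quick sanity check from the R console is to confirm that Rscript is resolvable and then re-run the test job. This sketch assumes the symbolic link has been created on every node that executes map and reduce tasks, since that is where Hadoop Streaming spawns the Rscript process:

    # Should return the full path of Rscript (e.g. "/usr/bin/Rscript");
    # an empty string means it is still not on the PATH of this machine
    Sys.which("Rscript")

    # Re-run the minimal test job
    from.dfs(mapreduce(to.dfs(1:100)))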
