Comments on: Loading Files into Hdfs Using Flume’s Spool Directory https://acadgild.com/blog/loading-files-into-hdfs-using-flumes-spool-directory/ Learn. Do. Earn. Thu, 25 Aug 2016 12:12:23 +0000 hourly 1 https://wordpress.org/?v=4.5.3 By: raju ega https://acadgild.com/blog/loading-files-into-hdfs-using-flumes-spool-directory/#comment-3421 Thu, 11 Aug 2016 07:24:33 +0000 https://acadgild.com/blog/?p=1095#comment-3421 Is there a way to load files from WebSphere MQ to HDFS using Flume? Please help me in this regard.
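Since WebSphere MQ exposes a JMS interface, Flume's built-in JMS source is the usual route. A rough sketch of such a configuration follows; the agent/source/channel names, the JNDI bindings path, and the queue name are illustrative assumptions about your MQ setup, not tested values (you will also need the MQ client JARs on Flume's classpath):

```properties
# Hypothetical Flume agent reading from a WebSphere MQ queue via the JMS source
a1.sources = r1
a1.channels = c1
a1.sources.r1.type = jms
a1.sources.r1.channels = c1
# MQ-specific values below are assumptions; adjust them to your environment
a1.sources.r1.initialContextFactory = com.sun.jndi.fscontext.RefFSContextFactory
a1.sources.r1.providerURL = file:///opt/mq/jndi-bindings
a1.sources.r1.connectionFactory = ConnectionFactory
a1.sources.r1.destinationName = MY.QUEUE
a1.sources.r1.destinationType = QUEUE
a1.channels.c1.type = memory
```

From there, pairing the source with an HDFS sink works the same way as in the spool-directory example in the post.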

]]>
By: Berk https://acadgild.com/blog/loading-files-into-hdfs-using-flumes-spool-directory/#comment-3194 Tue, 19 Jul 2016 11:28:38 +0000 https://acadgild.com/blog/?p=1095#comment-3194 My mistake. I can download the file. Thanks anyway.

]]>
By: Berk https://acadgild.com/blog/loading-files-into-hdfs-using-flumes-spool-directory/#comment-3193 Tue, 19 Jul 2016 11:26:23 +0000 https://acadgild.com/blog/?p=1095#comment-3193 Thanks for sharing this informative article. But there is a problem with the Google Drive link. When I open it, a warning pops up saying ‘no file in directory’. Could you please check the file?

Kind regards

]]>
By: Andy https://acadgild.com/blog/loading-files-into-hdfs-using-flumes-spool-directory/#comment-2500 Fri, 27 May 2016 17:37:23 +0000 https://acadgild.com/blog/?p=1095#comment-2500 Hi,
I see the files in the spooling directory (source) marked as COMPLETED, but nothing appears in the flume_sink directory in HDFS.
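When files get the .COMPLETED suffix but nothing reaches HDFS, the usual suspects are the channel wiring or the sink configuration. A minimal working wiring for this scenario might look like the sketch below; the agent, channel, and path names are illustrative, not the article's exact values:

```properties
# Hypothetical minimal spooldir-to-HDFS wiring for sanity-checking a config
agent1.sources = spool-src
agent1.channels = mem-ch
agent1.sinks = hdfs-sink

agent1.sources.spool-src.type = spooldir
agent1.sources.spool-src.spoolDir = /home/user/spooldir
agent1.sources.spool-src.channels = mem-ch

agent1.channels.mem-ch.type = memory

agent1.sinks.hdfs-sink.type = hdfs
agent1.sinks.hdfs-sink.hdfs.path = hdfs://localhost:8020/flume_sink
agent1.sinks.hdfs-sink.hdfs.fileType = DataStream
# Note: sinks use "channel" (singular); sources use "channels" (plural).
# A mismatch here lets the source run while the sink silently writes nothing.
agent1.sinks.hdfs-sink.channel = mem-ch
```

It is also worth checking the agent's console log for HDFS connection errors and confirming that the name passed with -n matches the agent name in the file.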

]]>
By: umar https://acadgild.com/blog/loading-files-into-hdfs-using-flumes-spool-directory/#comment-2174 Thu, 12 May 2016 06:15:34 +0000 https://acadgild.com/blog/?p=1095#comment-2174 When I run the agent, it shows the error below:

A fatal error occurred while running. Exception follows.
org.apache.commons.cli.MissingOptionException: Missing required option: n
at org.apache.commons.cli.Parser.checkRequiredOptions(Parser.java:299)
at org.apache.commons.cli.Parser.parse(Parser.java:231)
at org.apache.commons.cli.Parser.parse(Parser.java:85)
at org.apache.flume.node.Application.main(Application.java:265)
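This exception means the flume-ng command was started without the required -n option, which supplies the agent name; the name given to -n must match the agent name used inside the configuration file. A typical invocation looks like the following (the paths and agent name here are illustrative assumptions):

```shell
# -n (or --name) is mandatory and must match the agent name in the config file
flume-ng agent \
  --conf conf \
  --conf-file conf/spool-agent.conf \
  -n agent1 \
  -Dflume.root.logger=INFO,console
```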

]]>
By: RAM https://acadgild.com/blog/loading-files-into-hdfs-using-flumes-spool-directory/#comment-2132 Sun, 08 May 2016 04:49:55 +0000 https://acadgild.com/blog/?p=1095#comment-2132 Very good article.

]]>
By: Prateek Kumar https://acadgild.com/blog/loading-files-into-hdfs-using-flumes-spool-directory/#comment-1909 Fri, 22 Apr 2016 07:25:23 +0000 https://acadgild.com/blog/?p=1095#comment-1909 Hi Nilesh

Thanks for the feedback.

There are several JARs for the different sources in Flume.
We use a specific JAR for Twitter and a different one for the spool directory.
Likewise, there are a number of JAR files available for Flume.

Here is the link to all the available sources:
https://flume.apache.org/FlumeUserGuide.html#flume-sources

Hope this link is helpful for you.

]]>
By: Nilesh https://acadgild.com/blog/loading-files-into-hdfs-using-flumes-spool-directory/#comment-1908 Fri, 22 Apr 2016 07:04:05 +0000 https://acadgild.com/blog/?p=1095#comment-1908 Nice article, Prateek.

Could you explain when and why to use which sources in Flume?

]]>
By: AcadGild https://acadgild.com/blog/loading-files-into-hdfs-using-flumes-spool-directory/#comment-490 Mon, 04 Jan 2016 14:47:45 +0000 https://acadgild.com/blog/?p=1095#comment-490 Hi Arjun,

If you want to delete records from HBase through the Hive storage handler, then the Hive table should support transactions.
However, Hive transactions have very limited support; we can perform transactions only if:
1. The table is stored in the ORC file format
2. The table is bucketed on at least one column

For more details on Hive transaction support, refer to the blog below:
https://acadgild.com/blog/transactions-in-hive/
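As a sketch, a table satisfying both conditions (ORC storage and bucketing) with transactions enabled might be declared as follows; the table and column names are illustrative, and the session settings are assumptions about a typical setup:

```sql
-- Transactions also require concurrency support and the DbTxnManager;
-- these session settings are assumptions about your Hive configuration.
SET hive.support.concurrency = true;
SET hive.txn.manager = org.apache.hadoop.hive.ql.lockmgr.DbTxnManager;

CREATE TABLE orders_txn (
  order_id INT,
  amount DOUBLE
)
CLUSTERED BY (order_id) INTO 4 BUCKETS   -- condition 2: bucketed
STORED AS ORC                            -- condition 1: ORC format
TBLPROPERTIES ('transactional' = 'true');

-- With both conditions met, row-level deletes become possible:
DELETE FROM orders_txn WHERE order_id = 42;
```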

]]>
By: Arjun https://acadgild.com/blog/loading-files-into-hdfs-using-flumes-spool-directory/#comment-488 Mon, 04 Jan 2016 13:38:41 +0000 https://acadgild.com/blog/?p=1095#comment-488 Nice read. I have a question on the integrated tables. Basically, I want to understand the two scenarios below:
1. What if I want to delete a record from HBase? Can I trigger an HQL query in Hive (1.0 or higher) and get this done?
2. Does this support transactions? If I am inserting 100 records, all 100 should go in if successful, or 0 in case of failure.

]]>