// A small collection of Javadoc examples; look things up here when you forget.
Java / Hadoop section
/**
 * This class processes raw DNS logs: it computes the average response
 * latency for a given set of domain names.
 *
 * @param Input
 * @param Output
 * @param cacheUriListfilePath
 * @param cacheIpNetTypefilePath
 * <br>[the cache files must be uploaded to HDFS; each file is in K-V form,
 * with multiple Vs separated by ';']
 *
 * <P><B>NOTE:</B> this class is suited to latency statistics over a limited
 * set of domains. It is not suited to averaging the latency of all domains,
 * because the reducer aggregates into an in-memory collection, and all
 * domains at once would overflow memory.</p>
 */
<P> starts a separate paragraph (compare with <br>, a plain line break, and <pre>, which starts a preformatted block)
<B> is bold
@param documents a parameter
@author yanghl names the author
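Pulling the tags above together in one small sketch (the class and method names are invented for illustration):

/**
 * Demonstrates the tags above. Hypothetical class, for illustration only.
 *
 * @author yanghl
 */
public class DnsLogNotes {
    /**
     * Parses one raw DNS log line.
     *
     * <P>Each call handles exactly one record.
     * <B>NOTE:</B> malformed lines are silently skipped.
     *
     * @param line a single raw log line
     */
    public void parseLine(String line) {
        // no-op: this class exists only to show the tags in context
    }
}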
/**
 * This is an example Aggregated Hadoop Map/Reduce application. Computes the
 * histogram of the words in the input texts.
 *
 * To run: bin/hadoop jar hadoop-*-examples.jar aggregatewordhist <i>in-dir</i>
 * <i>out-dir</i> <i>numOfReducers</i> textinputformat
 */
<i> is italics; here it marks path placeholders the user substitutes with real values
/**
 * Creates a <code>Statement</code> object for sending
 * SQL statements to the database.
 * SQL statements without parameters are normally
 * executed using <code>Statement</code> objects. If the same SQL statement
 * is executed many times, it may be more efficient to use a
 * <code>PreparedStatement</code> object.
 * <P>
 * Result sets created using the returned <code>Statement</code>
 * object will by default be type <code>TYPE_FORWARD_ONLY</code>
 * and have a concurrency level of <code>CONCUR_READ_ONLY</code>.
 * The holdability of the created result sets can be determined by
 * calling {@link #getHoldability}.
 *
 * @return a new default <code>Statement</code> object
 * @exception SQLException if a database access error occurs
 * or this method is called on a closed connection
 */
@return
@exception
{@link #getHoldability} adds a hyperlink; getHoldability() is a method of the same class/package
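A minimal sketch combining the three tags (everything below is invented for illustration):

import java.io.IOException;

public class CacheFileReader {
    private String workingDir = ".";

    /**
     * Reads the cache file into memory.
     *
     * <P>The path is resolved relative to the directory set by
     * {@link #setWorkingDir}.
     *
     * @return the number of records loaded
     * @exception IOException if the file cannot be read
     */
    public int load() throws IOException {
        return 0; // placeholder body; the Javadoc is the point here
    }

    /** Sets the working directory; target of the link above. */
    public void setWorkingDir(String dir) {
        this.workingDir = dir;
    }
}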
/** Holds a <url, referrer, time > tuple */
static class AccessRecord implements Writable, DBWritable { ..... }
========================
< > : the <> inside this comment is just another notation (here a tuple), not an HTML tag
========================
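One caveat about the tuple comment above: the javadoc tool tries to parse a bare <...> as HTML and will warn about it; wrapping the tuple in {@code} (or escaping the brackets as &lt; and &gt;) keeps it literal. A small sketch:

/**
 * Holds a {@code <url, referrer, time>} tuple.
 *
 * <P>{@code} keeps the angle brackets literal; writing them as
 * &lt;url, referrer, time&gt; with HTML entities also works.
 */
class AccessRecord {
    String url;
    String referrer;
    long time;
}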
/**
 * <P>The basic service for managing a set of JDBC drivers.<br>
 * <B>NOTE:</B> The {@link <code>DataSource</code>} interface, new in the
 * JDBC 2.0 API, provides another way to connect to a data source.
 * The use of a <code>DataSource</code> object is the preferred means of
 * connecting to a data source.
 *
 * <P>As part of its initialization, the <code>DriverManager</code> class will
 * attempt to load the driver classes referenced in the "jdbc.drivers"
 * system property. This allows a user to customize the JDBC Drivers
 * used by their applications. For example in your
 * ~/.hotjava/properties file you might specify:
 * <pre>
 * <CODE>jdbc.drivers=foo.bah.Driver:wombat.sql.Driver:bad.taste.ourDriver</CODE>
 * </pre>
 */
===============================
【br, P, pre, code, B】tags
<br> is a line break
<pre> starts a preformatted block
===============================
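To make the difference among <br>, <P> and <pre> concrete, a small hand-written contrast (the class is invented; the notes in the comment describe how javadoc renders these tags):

/**
 * First line.<br>
 * After the br: still the same paragraph, just a new line.
 *
 * <P>After the P tag: a fresh paragraph with vertical spacing.
 *
 * <pre>
 * Inside pre:    spacing,  line breaks and
 *     indentation are preserved verbatim,
 *     handy for config lines like
 *     jdbc.drivers=foo.bah.Driver:wombat.sql.Driver
 * </pre>
 */
public class TagContrast { }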
/**
 * A tool interface that supports handling of generic command-line options.
 *
 * <p><code>Tool</code>, is the standard for any Map-Reduce tool/application.
 * The tool/application should delegate the handling of
 * <a href="{@docRoot}/../hadoop-project-dist/hadoop-common/CommandsManual.html#Generic_Options">
 * standard command-line options</a> to {@link ToolRunner#run(Tool, String[])}
 * and only handle its custom arguments.</p>
 *
 * <p>Here is how a typical <code>Tool</code> is implemented:</p>
 * <p><blockquote><pre>
 *     public class MyApp extends Configured implements Tool {
 *
 *       public int run(String[] args) throws Exception {
 *         // <code>Configuration</code> processed by <code>ToolRunner</code>
 *         Configuration conf = getConf();
 *
 *         // Create a JobConf using the processed <code>conf</code>
 *         JobConf job = new JobConf(conf, MyApp.class);
 *
 *         // Process custom command-line options
 *         Path in = new Path(args[1]);
 *         Path out = new Path(args[2]);
 *
 *         // Specify various job-specific parameters
 *         job.setJobName("my-app");
 *         job.setInputPath(in);
 *         job.setOutputPath(out);
 *
 *         job.setMapperClass(MyMapper.class);
 *         job.setReducerClass(MyReducer.class);
 *
 *         // Submit the job, then poll for progress until the job is complete
 *         JobClient.runJob(job);
 *         return 0;
 *       }
 *
 *       public static void main(String[] args) throws Exception {
 *         // Let <code>ToolRunner</code> handle generic command-line options
 *         int res = ToolRunner.run(new Configuration(), new MyApp(), args);
 *
 *         System.exit(res);
 *       }
 *     }
 * </pre></blockquote></p>
 *
 * @see GenericOptionsParser
 * @see ToolRunner
 */
@InterfaceAudience.Public
@InterfaceStability.Stable
public interface Tool extends Configurable {
【<p><blockquote><pre>】
You can also draw tables and lists inside code comments
/**
 * Rounding mode to round away from zero. Always increments the
 * digit prior to a non-zero discarded fraction. Note that this
 * rounding mode never decreases the magnitude of the calculated
 * value.
 *
 * <p>Example:
 * <table border>
 * <tr valign=top><th>Input Number</th>
 *     <th>Input rounded to one digit<br> with {@code UP} rounding
 * <tr align=right><td>5.5</td>  <td>6</td>
 * <tr align=right><td>2.5</td>  <td>3</td>
 * <tr align=right><td>1.6</td>  <td>2</td>
 * <tr align=right><td>1.1</td>  <td>2</td>
 * <tr align=right><td>1.0</td>  <td>1</td>
 * <tr align=right><td>-1.0</td> <td>-1</td>
 * <tr align=right><td>-1.1</td> <td>-2</td>
 * <tr align=right><td>-1.6</td> <td>-2</td>
 * <tr align=right><td>-2.5</td> <td>-3</td>
 * <tr align=right><td>-5.5</td> <td>-6</td>
 * </table>
 */
<ul>
  <li>a</li>
  <li>a</li>
  <li>a</li>
</ul>
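A list embedded in a complete Javadoc comment (a made-up validator, for illustration):

public class RecordValidator {
    /**
     * Validates a record before it is written out.
     *
     * <P>A record is rejected when any of the following holds:
     * <ul>
     *   <li>the key is empty</li>
     *   <li>the timestamp is negative</li>
     * </ul>
     *
     * @param key  record key
     * @param time record timestamp
     * @return true if the record passes all checks
     */
    public boolean isValid(String key, long time) {
        return !key.isEmpty() && time >= 0;
    }
}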
You can also 【write code or configuration-file content】 inside comments: {@code xxxx....}
/**
 * This program uses map/reduce to just run a distributed job where there is
 * no interaction between the tasks and each task writes a large unsorted
 * random binary sequence file of BytesWritable.
 * In order for this program to generate data for terasort with 10-byte keys
 * and 90-byte values, have the following config:
 * <pre>{@code
 * <?xml version="1.0"?>
 * <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
 * <configuration>
 *   <property>
 *     <name>mapreduce.randomwriter.minkey</name>
 *     <value>10</value>
 *   </property>
 *   <property>
 *     <name>mapreduce.randomwriter.maxkey</name>
 *     <value>10</value>
 *   </property>
 *   <property>
 *     <name>mapreduce.randomwriter.minvalue</name>
 *     <value>90</value>
 *   </property>
 *   <property>
 *     <name>mapreduce.randomwriter.maxvalue</name>
 *     <value>90</value>
 *   </property>
 *   <property>
 *     <name>mapreduce.randomwriter.totalbytes</name>
 *     <value>1099511627776</value>
 *   </property>
 * </configuration>}</pre>
 * Equivalently, {@link RandomWriter} also supports all the above options
 * and ones supported by {@link GenericOptionsParser} via the command-line.
 */
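One more place {@code} pays off (my own note, not from the quoted sources): generic types contain angle brackets, so inline {@code} avoids writing HTML entities:

import java.util.Collections;
import java.util.List;
import java.util.Map;

public class CodeTagDemo {
    /**
     * Groups lines by their first token.
     *
     * <P>Returns a {@code Map<String, List<String>>}; without the code tag
     * the angle brackets would have to be written as &lt; and &gt;.
     */
    public Map<String, List<String>> group(List<String> lines) {
        return Collections.emptyMap(); // placeholder body
    }
}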