• Analyzing IIS Logs with Log Parser Studio



    Source: https://www.cnblogs.com/lonelyxmas/p/8671336.html

    Software download links:

    Log Parser 2.2 

    Download: https://www.microsoft.com/en-us/download/details.aspx?displaylang=en&id=24659

    Log Parser Studio

    download: https://gallery.technet.microsoft.com/Log-Parser-Studio-cd458765

    Quick usage notes:

    1. Install LogParser.msi.

    2. Unzip LPSV2.D2.zip.

    3. Start Log Parser Studio by running LPS.exe.

    4. Point LPS at your IIS log file path.

    5. Create a new log query.

    6. See the built-in help for details.

    /* Top 100 slowest requests (time-taken is in milliseconds, so > 2000 means slower than 2 s) */
    SELECT TOP 100 cs-uri-stem, time-taken, TO_TIMESTAMP(date, time) AS Timestamp
    FROM '[LOGFILEPATH]'
    WHERE time-taken > 2000
    ORDER BY time-taken DESC

    /* Slow requests within a specific time-of-day window, slowest first */
    SELECT TOP 100 cs-uri-stem, time-taken, c-ip, TO_TIMESTAMP(date, time) AS Timestamp
    FROM '[LOGFILEPATH]'
    WHERE time-taken > 2000
      AND TO_TIME(time) > TIMESTAMP('00:00:05', 'hh:mm:ss')
      AND TO_TIME(time) < TIMESTAMP('23:00:15', 'hh:mm:ss')
    ORDER BY time-taken DESC
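
    A further sketch in the same style, assuming the standard W3C cs-uri-stem field is enabled in your IIS logging configuration (adjust the field list to whatever your server actually logs):

    ```sql
    /* Top 20 most-requested URLs (assumes the W3C cs-uri-stem field is logged) */
    SELECT TOP 20 cs-uri-stem, COUNT(*) AS Hits
    FROM '[LOGFILEPATH]'
    GROUP BY cs-uri-stem
    ORDER BY Hits DESC
    ```

    Aggregations like this are usually a better first pass over large logs than row-by-row queries, because they return a handful of rows regardless of log size.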

    Usage guide:

    https://blogs.technet.microsoft.com/exchange/2012/03/07/introducing-log-parser-studio/

    Introducing: Log Parser Studio

    download: https://gallery.technet.microsoft.com/Log-Parser-Studio-cd458765

    Anyone who regularly uses Log Parser 2.2 knows just how useful and powerful it can be for obtaining valuable information from IIS (Internet Information Server) and other logs. In addition, adding the power of SQL allows explicit searching of gigabytes of logs returning only the data that is needed while filtering out the noise. The only thing missing is a great graphical user interface (GUI) to function as a front-end to Log Parser and a ‘Query Library’ in order to manage all those great queries and scripts that one builds up over time.

    Log Parser Studio was created to fulfill this need; by allowing those who use Log Parser 2.2 (and even those who don’t due to lack of an interface) to work faster and more efficiently to get to the data they need with less “fiddling” with scripts and folders full of queries.

    With Log Parser Studio (LPS for short) we can house all of our queries in a central location. We can edit and create new queries in the ‘Query Editor’ and save them for later. We can search for queries using free text search as well as export and import both libraries and queries in different formats allowing for easy collaboration as well as storing multiple types of separate libraries for different protocols.

    Processing Logs for Exchange Protocols

    We all know this very well: processing logs for different Exchange protocols is a time-consuming task. In the absence of special-purpose tools, it becomes a tedious task for an Exchange Administrator to sift through those logs and process them using Log Parser (or some other tool), especially if the output format is important. You also need expertise in writing those SQL queries. You can also use special-purpose scripts found on the web and then analyze the output to make some sense out of those lengthy logs. Log Parser Studio is mainly designed for quick and easy processing of different logs for Exchange protocols. Once you launch it, you'll notice tabs for different Exchange protocols, i.e. Microsoft Exchange ActiveSync (MAS), Exchange Web Services (EWS), Outlook Web App (OWA/HTTP) and others. Under those tabs there are dozens of SQL queries written for specific purposes (the description and other particulars of a query are also available in the main UI), which can be run with just one click!

    Let’s get into the specifics of some of the cool features of Log Parser Studio …

    Query Library and Management

    Upon launching LPS, the first thing you will see is the Query Library preloaded with queries. This is where we manage all of our queries. The library is always available by clicking on the Library tab. You can load a query for review or execution using several methods. The easiest method is to simply select the query in the list and double-click it. Upon doing so the query will auto-open in its own Query tab. The Query Library is home base for queries. All queries maintained by LPS are stored in this library. There are easy controls to quickly locate desired queries & mark them as favorites for quick access later.

    image

    Library Recovery

    The initial library that ships with LPS is embedded in the application and created upon install. If you ever delete, corrupt or lose the library you can easily reset back to the original by using the recover library feature (Options | Recover Library). When recovering the library all existing queries will be deleted. If you have custom/modified queries that you do not want to lose, you should export those first, then after recovering the default set of queries, you can merge them back into LPS.

    Import/Export

    Depending on your need, the entire library or subsets of the library can be imported and exported either as the default LPS XML format or as SQL queries. For example, if you have a folder full of Log Parser SQL queries, you can import some or all of them into LPS’s library. Usually, the only thing you will need to do after the import is make a few adjustments. All LPS needs is the base SQL query and to swap out the filename references with ‘[LOGFILEPATH]’ and/or ‘[OUTFILEPATH]’ as discussed in detail in the PDF manual included with the tool (you can access it via LPS | Help | Documentation).
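
    As an illustration of that adjustment, a standalone Log Parser 2.2 query and its imported LPS form differ only in the FROM clause (the hard-coded log path below is a hypothetical example, not a path LPS requires):

    ```sql
    /* Standalone Log Parser 2.2 query with a hard-coded log path (example path) */
    SELECT cs-uri-stem, sc-status
    FROM 'C:\inetpub\logs\LogFiles\W3SVC1\*.log'
    WHERE sc-status >= 500

    /* The same query after import into LPS: the path is replaced with the token */
    SELECT cs-uri-stem, sc-status
    FROM '[LOGFILEPATH]'
    WHERE sc-status >= 500
    ```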

    Queries

    Remember that a well-written structured query makes all the difference between a successful query that returns the concise information you need vs. a subpar query which taxes your system, returns much more information than you actually need and in some cases crashes the application.

    image

    The art of creating great SQL/Log Parser queries is outside the scope of this post, however all of the queries included with LPS have been written to achieve the most concise results while returning the fewest records. Knowing what you want and how to get it with the least number of rows returned is the key!

    Batch Jobs and Multithreading

    You'll find that LPS in combination with Log Parser 2.2 is a very powerful tool. However, if all you could do was run a single query at a time and wait for the results, you probably wouldn't be making nearly as much progress as you could be. To address this, LPS provides both batch jobs and multithreaded queries.

    A batch job is simply a collection of predefined queries that can all be executed with the press of a single button. From within the Batch Manager you can remove any single query, or all of them, as well as execute them. You can also execute them by clicking the Run Multiple Queries button or the Execute button in the Batch Manager. Upon execution, LPS will prepare and execute each query in the batch. By default, LPS sends ALL queries to Log Parser 2.2 as soon as each is prepared. This is where multithreading works in our favor. For example, if we have 50 queries set up as a batch job and execute the job, we'll have 50 threads in the background all working with Log Parser simultaneously, leaving the user free to work with other queries. As each job finishes, the results are passed back to the grid or the CSV output based on the query type. Even in this scenario you can continue to work with other queries: search, modify and execute. As each query completes, its thread is retired and its resources freed. These threads are managed very efficiently in the background, so there should be no issue running multiple queries at once.

    image

    Now what if we want the queries in the batch to run sequentially instead, for performance or other reasons? This functionality is already built into LPS's options. Just make the change in LPS | Options | Preferences by checking the 'Process Batch Queries in Sequence' checkbox. When checked, the first query in the batch is executed, and the next query will not begin until the first one is complete. This process continues until the last query in the batch has been executed.

    Automation

    In conjunction with batch jobs, automation allows unattended scheduled automation of batch jobs. For example we can create a scheduled task that will automatically run a chosen batch job which also operates on a separate set of custom folders. This process requires two components, a folder list file (.FLD) and a batch list file (.XML). We create these ahead of time from within LPS. For more details on how to do that, please refer to the manual.

    Charts

    Many queries that return data to the Result Grid can be charted using the built-in charting feature. The basic requirements for charts are the same as Log Parser 2.2, i.e.

    1. The first column in the grid may be any data type (string, number etc.)
    2. The second column must be some type of number (Integer, Double, Decimal); strings are not allowed

    Keep the above requirements in mind when creating your own queries so that you will consciously write the query to include a number for column two. To generate a chart click the chart button after a query has completed. For #2 above, even if you forgot to do so, you can drag any numbered column and drop it in the second column after the fact. This way if you have multiple numbered columns, you can simply drag the one that you’re interested in, into second column and generate different charts from the same data. Again, for more details on charting feature, please refer to the manual.
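
    For example, a minimal chart-friendly query (a sketch assuming standard W3C fields) puts the category in column one and a number in column two:

    ```sql
    /* Column 1: HTTP status code (the category), column 2: request count (a number) */
    SELECT sc-status, COUNT(*) AS Requests
    FROM '[LOGFILEPATH]'
    GROUP BY sc-status
    ORDER BY Requests DESC
    ```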

    image

    Keyboard Shortcuts/Commands

    There are multiple keyboard shortcuts built-in to LPS. You can view the list anytime while using LPS by clicking LPS | Help | Keyboard Shortcuts. The currently included shortcuts are as follows:

    Shortcut        What it does
    CTRL+N Start a new query.
    CTRL+S Save active query in library or query tab depending on which has focus.
    CTRL+Q Open library window.
    CTRL+B Add selected query in library to batch.
    ALT+B Open Batch Manager.
    CTRL+D Duplicates the current active query to a new tab.
    CTRL+ALT+E Open the error log if one exists.
    CTRL+E Export current selected query results to CSV.
    ALT+F Add selected query in library to the favorites list.
    CTRL+ALT+L Open the raw Library in the first available text editor.
    CTRL+F5 Reload the Library from disk.
    F5 Execute active query.
    F2 Edit name/description of currently selected query in the Library.
    F3 Display the list of IIS fields.

    Supported Input and Output types

    Log Parser 2.2 has the ability to query multiple types of logs. Since LPS is a work in progress, only the most used types are currently available. Additional input and output types will be added when possible in upcoming versions or updates.

    Supported Input Types

    There is full support for W3SVC/IIS, CSV and HTTP Error logs, and basic support for all built-in Log Parser 2.2 input formats. In addition, LPS ships with some custom formats, such as Microsoft Exchange-specific formats that are not available with the default Log Parser 2.2 install.

    Supported Output Types

    CSV and TXT are the currently supported output file types.
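
    In standard Log Parser SQL, file output is expressed with an INTO clause; in LPS the output path is substituted via the '[OUTFILEPATH]' token, in the same way '[LOGFILEPATH]' stands in for the input. A sketch, assuming the query's output type is set to CSV:

    ```sql
    /* Daily request counts written to a CSV file rather than the result grid */
    SELECT date, COUNT(*) AS Requests
    INTO '[OUTFILEPATH]'
    FROM '[LOGFILEPATH]'
    GROUP BY date
    ORDER BY date
    ```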

    Log Parser Studio - Quick Start Guide

    Want to skip all the details & just run some queries right now? Start here …

    The very first thing Log Parser Studio needs to know is where the log files are, and the default location where queries that export their results as CSV files should save them.

    1. Setup your default CSV output path:

    a. Go to LPS | Options | Preferences | Default Output Path.

    b. Browse to and select the folder you would like to use for exported results.

    c. Click Apply.

    d. Any queries that export CSV files will now be saved in this folder. 
    NOTE: If you forget to set this path before you start, CSV files will be saved in %AppData%\Microsoft\Log Parser Studio by default, but it is recommended that you change this to another location.

    2. Tell LPS where the log files are by opening the Log File Manager. If you try to run a query before completing this step, LPS will prompt you to set the log path; clicking OK on that prompt opens the Log File Manager. Click Add Folder to add a folder, or Add File to add one or more files. When adding a folder you must still select at least one file so LPS knows which type of log it is working with; LPS will automatically turn the selection into a wildcard (*.xxx), indicating that all matching logs in the folder will be searched.

    You can easily tell which folder or files are currently being searched by examining the status bar at the bottom-right of Log Parser Studio. To see the full path, roll your mouse over the status bar.

    NOTE: LPS and Log Parser handle multiple types of logs and objects that can be queried. It is important to remember that the type of log you are querying must match the query you are performing. In other words, when running a query that expects IIS logs, only IIS logs should be selected in the File Manager. Failure to do this (it's easy to forget) will result in errors or unexpected behavior when running the query.

    3. Choose a query from the library and run it:

    a. Click the Library tab if it isn’t already selected.

    b. Choose a query in the list and double-click it. This will open the query in its own tab.

    c. Click the Run Single Query button to execute the query.

    The query execution will begin in the background. Once the query has completed, there are two possible output targets: the result grid in the top half of the query tab, or a CSV file. Some queries return results to the grid, while other, more memory-intensive queries are saved to CSV.

    As a general rule queries that may return very large result sets are probably best served going to a CSV file for further processing in Excel. Once you have the results there are many features for working with those results. For more details, please refer to the manual.

    Have fun with Log Parser Studio! & always remember – There’s a query for that!

    Kary Wall 
    Escalation Engineer 
    Microsoft Exchange Support

    ====================================  End

  • Original post: https://www.cnblogs.com/lsgxeva/p/10477908.html