
Export and share logs

Site24x7 provides robust log management capabilities that let you export log data for further analysis or sharing. You can download log entries in CSV format for reporting, troubleshooting, and deeper insights. The number of log entries you can export depends on your subscription plan: higher-tier plans support up to one million entries per export, while other plans support up to 100,000 entries.

Export logs

To export logs:

  1. Log in to Site24x7.
  2. Navigate to AppLogs.
  3. Enter the query and press Enter or click the Search icon.
  4. Click the Share icon. The share option will be unavailable if no data is found.
  5. Choose the log entries to download or export:
    1. Loaded Data: Download the log entries currently loaded in the client, in either CSV or PDF format.
    2. All Entries: Download up to 10,000 log entries in CSV format.
    3. Export (Up to the allowed number of entries): Export log data in CSV format.
  6. Click the Download button.
Note
  • PDF format is only available for the Loaded Data option. CSV format is available for all other options.
  • Export is available only for Professional (up to 100,000 entries) and Enterprise (up to one million entries) accounts.
  • The Export option can be accessed only by users with Admin, Super Admin, or Operator roles.

History of exports

The History tab provides a record of all previous log exports. From this tab, you can:

  • View the date and time of each export.
  • Download previously exported log data again.

Exported log data is retained for five days, allowing you to download it again within this period.

If you need any assistance with exporting logs, please contact our support team.

Email option

You can export logs in PDF format and share them via email, but this is supported only for the logs currently loaded in the interface. This option is useful for quick sharing of log data directly from the system.

Exporting logs over one million entries via REST API

There may be instances where you need to export more than one million log entries. In such cases, you can use the script below to download the logs via the REST API. Note that the download can be slow, as each API call retrieves only 1,000 records. The script handles this by calling the API page by page and appending the data to a CSV file.
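The page-by-page retrieval works by mapping the total result count onto fixed 1,000-record index ranges in the API path. The helper below is an illustrative sketch of that arithmetic only; `page_ranges` is not part of the Site24x7 API or of the script that follows:

```python
# Minimal sketch of how 1,000-record pages map to the start-end
# indices used in the API path (e.g. .../search/{start}/{end}/1-1000/desc).
def page_ranges(total, page_size=1000):
    """Yield (start, end) record indices: (1, 1000), (1001, 2000), ..."""
    pages = (total // page_size) + ((total % page_size) > 0)  # ceiling division
    for i in range(pages):
        yield (i * page_size + 1, (i + 1) * page_size)

# For 2,500 matching entries, three calls are issued:
print(list(page_ranges(2500)))  # [(1, 1000), (1001, 2000), (2001, 3000)]
```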

Note

Make sure to update the USER INPUTS section in the script before executing it.

import os
##### USER INPUTS ###############################################
client_id = "1000.XXXXX"
client_secret = "XXXXXXXX"
code = "1000.XXXXX.YYYYYY"
#format DD-MM-YYYY HH:mm:ss
start_time = "11-11-2024 00:00:01"
end_time = "11-11-2024 23:59:59"
#change the query and delimit the characters wherever applicable
query = "logtype=\"Infrastructure Events\""
#change the location where the output must be stored
#to override, provide the value as a string, for example "/home/user/Documents/Project/Bulk_Export"
destination_path = os.getcwd() 
#change the name of the output file
filename = "output"
##################################################################
import time
try:
    import requests
except ImportError:
    # Attempt to install requests if it is missing, then retry the import.
    try:
        os.system('python3 -m pip install requests')
        import requests
    except Exception as e:
        print(e)
access_token = None
path = os.path.join(destination_path, "output")
if not os.path.exists(path):
    os.mkdir(path)
else:
    # Remove previous output files, keeping the saved refresh token (.txt).
    for each_file in os.listdir(path):
        if not each_file.endswith(".txt"):
            os.remove("{}/{}".format(path, each_file))
access_token_payload = {
    "client_id" : client_id,
    "client_secret" : client_secret
}
auth_token_url = "https://accounts.zoho.com/oauth/v2/token?"

def get_data():
    
    global access_token
    
    data_exclusion = ("_zl","s247","inode")
    query_params = {"query":query}
    
    print("Starting export")
    
    while True:
        
        print("Executing query")
        r = requests.get('https://www.site24x7.com/api/applog/search/{}/{}/1-1000/desc?'.format(start_time,end_time), headers={'Authorization': access_token},params=query_params)
        if r.status_code == 200 and (int(r.json()["data"]["ErrorCode"]) == 0):                
            
            print("Downloading...")
            
            f = open("{}/{}.csv".format(path,filename), "a")
            total = int(r.json()["data"]["numFound"])
            csv_headers = []
            # Fetch the results 1,000 records at a time (ceiling of total / 1,000 pages).
            for iterations in range(0, (total // 1000) + ((total % 1000) > 0)):
                response = requests.get("https://www.site24x7.com/api/applog/search/{}/{}/{}-{}/desc?".format(start_time,end_time,(iterations*1000)+1,(iterations+1)*1000), headers={'Authorization':access_token },params=query_params)
                
                if response.status_code == 200 and (int(response.json()["data"]["ErrorCode"]) == 0):
                    
                    if iterations == 0:
                        for field_key in response.json()['data']['docs'][0]:
                            if not field_key.startswith(data_exclusion):
                                csv_headers.append(field_key)
                        f.write(','.join(csv_headers)+"\n")
                    
                    for row in response.json()['data']['docs']:
                        # Quote each value and double embedded quotes (CSV escaping).
                        rowData = ''
                        for field_key in csv_headers:
                            rowData = rowData + '"' + str(row.get(field_key, "")).replace('"', '""') + '"' + ','
                        rowData = rowData.rstrip(",")
                        f.write(rowData + "\n")
                time.sleep(1)
            f.close()
            print("Export successfully completed")
            break
        
        elif r.status_code == 401:
            get_access_token()
        
        else:
            print("Error while retrieving data")
            break

def get_access_token():
    
    global access_token
    access_token_response = requests.post(url=auth_token_url,params=access_token_payload)
    if access_token_response.status_code == 200 and ("error" not in str(access_token_response.content)):
        access_token = "Zoho-oauthtoken " + access_token_response.json()["access_token"]
    else:
        print("Error while fetching access token - ", str(access_token_response.content))
        exit()
def get_refresh_token():
    refresh_token_payload = access_token_payload.copy()
    refresh_token_payload.update({
        "grant_type" : "authorization_code",
        "code" : code
        })
    
    refresh_token_response = requests.post(url=auth_token_url,params=refresh_token_payload)
    
    if refresh_token_response.status_code == 200 and ("error" not in str(refresh_token_response.content)):
        access_token_payload["grant_type"] = "refresh_token"
        access_token_payload["refresh_token"] = refresh_token_response.json()["refresh_token"]
        with open("{}/{}.txt".format(path, "refresh_token"), "w") as rt_file:
            rt_file.write(access_token_payload["refresh_token"])
    else:
        print("Error while fetching refresh token - ",str(refresh_token_response.content))
        exit()
if __name__ == "__main__":  
    if not os.path.exists("{}/refresh_token.txt".format(path)):
        get_refresh_token()
    else:
        access_token_payload["grant_type"] = "refresh_token"
        with open("{}/refresh_token.txt".format(path), 'r') as rt_file:
            access_token_payload["refresh_token"] = rt_file.read()
    get_access_token()
    get_data()
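The script escapes CSV values by wrapping each field in quotes and doubling any embedded quote characters. Python's standard csv module applies the same convention and can serve as a quick sanity check; the snippet below is a standalone sketch, not part of the export script:

```python
import csv
import io

# The csv module quotes every field and doubles embedded quotes,
# matching the manual escaping done in the script above.
buf = io.StringIO()
writer = csv.writer(buf, quoting=csv.QUOTE_ALL)
writer.writerow(['level=error', 'message with "quotes"'])
print(buf.getvalue().strip())  # "level=error","message with ""quotes"""
```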
