
I am finally getting the hang of Python and have started using it on a daily basis at work. However, the learning curve is still steep, and I have hit a roadblock trying something new with code I found here for scraping members from Telegram channels.

Currently, in lines 38-44, we can select a group from the list and the script scrapes the user data into members.csv.

EDIT: Resolved the CSV naming issue:


    print('Saving in file...')
    print(target_group.title)
    # Use the group title to build a unique filename for each group
    filename = target_group.title
    with open("{}.csv".format(filename), "w", encoding='UTF-8') as f:
        # ... CSV writing continues as before
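
One caveat I noticed: group titles can contain characters that are not valid in filenames (slashes, colons, and so on), so open() can fail for some groups. A minimal sketch of a sanitizer, assuming only the filename logic above (the helper name is my own):

    import re

    def safe_filename(title):
        # Keep letters, digits, underscores, dashes and spaces; replace the rest
        return re.sub(r'[^\w\- ]', '_', title)

    filename = safe_filename(target_group.title)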

Instead of relying on input, I would like to create a for loop which would iterate through every group in the list.

    print('Choose a group to scrape members from:')
    for i, g in enumerate(groups):
        print(str(i) + '- ' + g.title)
    g_index = input("Enter a Number: ")
    target_group = groups[int(g_index)]

The problem is that I am not sure exactly how to replace this part of the code with a for loop.

Admittedly, just changing it into a for loop would merely overwrite the same members.csv file on each iteration; I plan to change that so that each group is written to its own file.

So, circling back to my question: how do I make a single run of the program loop through all of the groups, or just select all of them?

Thanks for the help!

2 Answers


  1. Chosen as BEST ANSWER

    Ended up figuring out the issue:

    On naming the CSV file: I used the group's title attribute for the filename and substituted it into the string with format().

    g_index = chat_num  # chat_num is the argument of the function described below
    target_group = groups[int(g_index)]
    filename = target_group.title
    print('Fetching Members from {} ...'.format(filename))
    all_participants = client.get_participants(target_group, aggressive=True)

    print('Saving in file...')
    with open("{}.csv".format(filename), "w", encoding='UTF-8') as f:
        # ... CSV writing continues as before
    

    On creating a for loop for the sequence: the original code (posted in the question) did not include a for loop. My workaround was to wrap everything into a function and then iterate through an indexed list whose length equals the number of chats detected. In the end it looks like this (a rough sketch of the get() function itself follows the loop):

    for x in range(len(chats)):
        try:
            get(x)
        except Exception:
            # Not every chat index maps to a scrapable group; skip and move on
            print("No more groups.", end=" ")
    print("Done")
    

    Overall, this might not be the best solution to accomplish what I set out to do, but it's good enough for me for now, and I have learned a lot. Maybe someone in the future will find this beneficial. Full code available here: https://github.com/ivanstruk/telegram-member-scraper/

    Cheers!


  2. Couldn’t test this, but something like this maybe? This creates a new .csv file for each group.

    for chat in chats:
        try:
            if chat.megagroup:
                groups.append(chat)
        except AttributeError:
            # Not every chat object has a megagroup attribute
            continue

    for current_group in groups:

        print(f'Fetching members for group "{current_group.title}"...')
        all_participants = client.get_participants(current_group, aggressive=True)

        current_file_name = f"members_{current_group.title}.csv"

        print(f'Saving in file "{current_file_name}"...')
        with open(current_file_name, "w+", encoding="UTF-8") as file:
            writer = csv.writer(file, delimiter=",", lineterminator="\n")
            writer.writerow(["username", "user id", "access hash", "name", "group", "group id"])
            for user in all_participants:
                username = user.username if user.username else ""
                first_name = user.first_name.strip() if user.first_name else ""
                last_name = user.last_name.strip() if user.last_name else ""
                name = f"{first_name} {last_name}"
                row = [username, user.id, user.access_hash, name, current_group.title, current_group.id]
                writer.writerow(row)
        print(f'Finished writing to file "{current_file_name}".')
    print("Members scraped successfully.")
    