November 29, 2024  15. XCTF Training-WWW-Robots  As soon as you open the site, this line of text appears: In this little training challenge, you are going to learn about the Robots_exclusion_standard. The robots.txt file is used by web crawlers to check if they are allowed to crawl and index your website or only parts of it. Sometimes these fil...
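As a minimal sketch of the standard the challenge mentions (not part of the original write-up), this shows how a crawler could fetch and consult a site's robots.txt with Python's standard urllib.robotparser; the URL, path, and user agent are illustrative assumptions:

```python
# Sketch: checking robots.txt rules with Python's standard library.
# The site URL, path, and "*" user agent below are illustrative assumptions,
# not taken from the challenge write-up.
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("http://example.com/robots.txt")  # hypothetical target site
rp.read()                                    # fetch and parse robots.txt

# Ask whether a generic crawler is allowed to fetch a given path.
print(rp.can_fetch("*", "http://example.com/some/hidden/page"))
```

In a CTF context the same file is usually just requested directly (GET /robots.txt), since disallowed paths often point at the content the challenge wants you to find.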
April 25, 2025  [Original] Using a Python script to delete the last 4 characters of each line in a txt file and write each result on a new line  # import os filename = r"123.txt" new_filename = r"1234.txt" with open(filename,encoding="utf-8") as f1, open(new_filename,"w",encoding="utf-8") as f2: for line in f... ...
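The excerpt above is cut off by the listing; a complete, runnable sketch of the same idea (strip the last 4 characters from each line of 123.txt and write each shortened line to 1234.txt) might look like the following. Only the filenames and the file-opening code appear in the excerpt; the loop body is an assumption:

```python
# Sketch of the script described above: drop the last 4 characters of every
# line in 123.txt and write each shortened line to 1234.txt on its own line.
# Only the filenames and the "with open(...)" line come from the excerpt;
# the slicing logic inside the loop is an assumed reconstruction.
filename = r"123.txt"
new_filename = r"1234.txt"

with open(filename, encoding="utf-8") as f1, open(new_filename, "w", encoding="utf-8") as f2:
    for line in f1:
        line = line.rstrip("\n")    # strip the trailing newline first
        f2.write(line[:-4] + "\n")  # drop the last 4 characters, restore the newline
```

Stripping the newline before slicing matters: otherwise the "\n" itself would count as one of the 4 characters being removed, and lines shorter than 4 characters simply become empty lines in the output.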