March 9, 2020

The robots.txt file uses a very simple, line-oriented syntax. There are three types of lines in a robots.txt file: blank lines, comment lines, and rule lines. Rule lines look like HTTP headers (<Field>: <value>) and are used for pattern matching. For example:

# this robots.txt file allows Slurp & Webcrawler to crawl
# the public parts of our site, but no other robots .....
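A minimal sketch of the line-oriented parsing described above, assuming we only need to classify each line (blank, comment, or rule) and split rule lines into (field, value) pairs; the function name `parse_robots` and the sample text are illustrative, not from any real library:

```python
def parse_robots(text):
    """Parse robots.txt text into a list of (field, value) rule pairs."""
    rules = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blank lines and comment lines
        # Rule lines look like HTTP headers: <Field>: <value>
        field, _, value = line.partition(":")
        rules.append((field.strip().lower(), value.strip()))
    return rules

example = """# allow Slurp to crawl the public parts of our site
User-Agent: slurp
Disallow:
"""
print(parse_robots(example))  # [('user-agent', 'slurp'), ('disallow', '')]
```

Lower-casing the field name reflects the fact that, like HTTP header names, robots.txt field names are conventionally matched case-insensitively.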
txt_file.write(f'\n{pic}{name} TV' + ',#genre#\n')
for i in range(len(lines)):
    line = lines[i].strip()
    # print(line)
    if line.startswith("#EXTINF:-1"):
        next_line = lines[i + 1].strip() if i + 1 < len(lines) else None
        if next_line and next_line.startswith("http...
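The truncated loop above appears to pair each "#EXTINF:-1" metadata line with the stream URL on the line that follows it. A self-contained sketch of that pairing logic, under the assumption that the channel name sits after the last comma of the #EXTINF line (the function name `pair_extinf` and the sample playlist are hypothetical):

```python
def pair_extinf(lines):
    """Pair each #EXTINF:-1 metadata line with the URL on the next line."""
    pairs = []
    for i, raw in enumerate(lines):
        line = raw.strip()
        if line.startswith("#EXTINF:-1"):
            next_line = lines[i + 1].strip() if i + 1 < len(lines) else None
            if next_line and next_line.startswith("http"):
                # Channel name follows the last comma of the #EXTINF line
                name = line.rsplit(",", 1)[-1]
                pairs.append((name, next_line))
    return pairs

playlist = [
    "#EXTM3U",
    "#EXTINF:-1,CCTV-1",
    "http://example.com/cctv1.m3u8",
]
print(pair_extinf(playlist))  # [('CCTV-1', 'http://example.com/cctv1.m3u8')]
```

Guarding the `lines[i + 1]` access with `i + 1 < len(lines)`, as the original code does, avoids an IndexError when a playlist ends with a dangling #EXTINF line.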