[eluser]fuchong[/eluser]
Hello,
So I've written an API that's in charge of parsing some POST data. The POST data includes a docx file that has been read into a string using PHP's stream_get_contents. The issue I'm having is that CI is filtering the POST content even though I have global_xss_filtering set to FALSE. Therefore, when I try to write the contents to a file, I get a corrupted docx file.
I've taken the core parts of this application and set it up in a non-CI environment, and it works just fine there, since the data isn't being filtered.
I know that CI is filtering the data for security purposes, but is there a way to stop CI from doing this?
Even if it's possible, I feel this approach is bad practice. What we're trying to do is have a third-party client send us a bunch of POST data and a file. Is there a better way to accomplish this?
Thanks!
Here's the code for what I'm trying to do:
Read in the file and send it off
Code:
$handle = fopen("somefile.docx", "rb");
$contents = stream_get_contents($handle);
fclose($handle);
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://some_server_address/index.php/some_controller");
curl_setopt($ch, CURLOPT_POST, 1);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE);
curl_setopt($ch, CURLOPT_POSTFIELDS, array('file'=>$contents));
$response = curl_exec($ch);
curl_close($ch);
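One way to keep the bytes out of $_POST entirely (a sketch, not tested against this API; CURLFile needs PHP 5.5+, and the URL and field name are the same assumed ones as above) would be to send the docx as a real multipart file upload, so it arrives in $_FILES on the server rather than as a POST field that CI filters:

```php
<?php
// Sketch: send the docx as a multipart/form-data file upload instead
// of stuffing the raw bytes into a plain POST field.
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://some_server_address/index.php/some_controller");
curl_setopt($ch, CURLOPT_POST, 1);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE);
curl_setopt($ch, CURLOPT_POSTFIELDS, array(
    // CURLFile streams the file as an upload; on the server it shows up
    // in $_FILES['file'] (a temp file on disk), not in $_POST.
    'file' => new CURLFile(
        "somefile.docx",
        "application/vnd.openxmlformats-officedocument.wordprocessingml.document",
        "somefile.docx"
    ),
));
$response = curl_exec($ch);
curl_close($ch);
```

On the receiving end the controller would then do something like move_uploaded_file($_FILES['file']['tmp_name'], "myFile.docx") instead of reading a POST field.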
Get the file and make a new file out of it
Code:
$ourFileHandle = fopen("myFile.docx", 'wb') or die("can't open file");
fwrite($ourFileHandle, $this->input->post('file', FALSE));
fclose($ourFileHandle);
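Another workaround I've considered (a sketch under the assumption that the endpoint and the 'file' field name stay the same): base64-encode the binary before POSTing, so the payload is plain ASCII and passes through any input filtering untouched, then decode it in the controller before writing the file:

```php
<?php
// Simulate arbitrary binary content; in the real case this would be
// the docx read with stream_get_contents / file_get_contents.
$contents = "\x50\x4B\x03\x04" . random_bytes(32); // docx files start with the ZIP magic bytes

// Sender side: encode before putting it in CURLOPT_POSTFIELDS,
// e.g. array('file' => $payload).
$payload = base64_encode($contents);   // ASCII-safe representation

// Receiver side (inside the CI controller): decode what came out of
// $this->input->post('file') before writing it to disk.
$restored = base64_decode($payload);   // byte-for-byte identical to $contents
```

The trade-off is roughly a 33% larger request body, but nothing in the payload can trip the XSS filter.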