<?xml version="1.0" encoding="utf-8"?>
<feed xml:lang="en-us" xmlns="http://www.w3.org/2005/Atom"><title>Simon Willison's Weblog: files</title><link href="http://simonwillison.net/" rel="alternate"/><link href="http://simonwillison.net/tags/files.atom" rel="self"/><id>http://simonwillison.net/</id><updated>2025-07-17T14:31:38+00:00</updated><author><name>Simon Willison</name></author><entry><title>Quoting Terence Eden</title><link href="https://simonwillison.net/2025/Jul/17/terence-eden/#atom-tag" rel="alternate"/><published>2025-07-17T14:31:38+00:00</published><updated>2025-07-17T14:31:38+00:00</updated><id>https://simonwillison.net/2025/Jul/17/terence-eden/#atom-tag</id><summary type="html">
    &lt;blockquote cite="https://shkspr.mobi/blog/2025/07/weve-got-to-stop-sending-files-to-each-other/"&gt;&lt;p&gt;The modern workforce shouldn't be flinging copies to each other. A copy is outdated the moment it is downloaded. A copy has no protection against illicit reading. A copy can never be revoked.&lt;/p&gt;
&lt;p&gt;Data shouldn't live in a file on a laptop. It shouldn't be a single file on a network share. Data is a &lt;em&gt;living&lt;/em&gt; beast. Data needs to live in a database - not an Excel file. Access should be granted for each according to their needs.&lt;/p&gt;&lt;/blockquote&gt;
&lt;p class="cite"&gt;&amp;mdash; &lt;a href="https://shkspr.mobi/blog/2025/07/weve-got-to-stop-sending-files-to-each-other/"&gt;Terence Eden&lt;/a&gt;, We've got to stop sending files to each other&lt;/p&gt;

    &lt;p&gt;Tags: &lt;a href="https://simonwillison.net/tags/files"&gt;files&lt;/a&gt;, &lt;a href="https://simonwillison.net/tags/terence-eden"&gt;terence-eden&lt;/a&gt;&lt;/p&gt;



</summary><category term="files"/><category term="terence-eden"/></entry><entry><title>Parsing file uploads at 500 mb/s with node.js</title><link href="https://simonwillison.net/2010/Jun/2/parsing/#atom-tag" rel="alternate"/><published>2010-06-02T15:57:00+00:00</published><updated>2010-06-02T15:57:00+00:00</updated><id>https://simonwillison.net/2010/Jun/2/parsing/#atom-tag</id><summary type="html">
    
&lt;p&gt;&lt;strong&gt;&lt;a href="http://debuggable.com/posts/parsing-file-uploads-at-500-mb-s-with-node-js:4c03862e-351c-4faa-bb67-4365cbdd56cb"&gt;Parsing file uploads at 500 mb/s with node.js&lt;/a&gt;&lt;/strong&gt;&lt;/p&gt;
Handling file uploads is a real sweet spot for Node.js, especially now that it has a high-performance Buffer API for dealing with binary chunks of data. Felix Geisendörfer has released a new library called “formidable” which makes receiving file uploads (including HTML5 multiple uploads) easy, and uses some clever algorithmic tricks to dramatically speed up the processing of multipart data.


    &lt;p&gt;Tags: &lt;a href="https://simonwillison.net/tags/binary"&gt;binary&lt;/a&gt;, &lt;a href="https://simonwillison.net/tags/buffers"&gt;buffers&lt;/a&gt;, &lt;a href="https://simonwillison.net/tags/html5"&gt;html5&lt;/a&gt;, &lt;a href="https://simonwillison.net/tags/javascript"&gt;javascript&lt;/a&gt;, &lt;a href="https://simonwillison.net/tags/nodejs"&gt;nodejs&lt;/a&gt;, &lt;a href="https://simonwillison.net/tags/uploads"&gt;uploads&lt;/a&gt;, &lt;a href="https://simonwillison.net/tags/recovered"&gt;recovered&lt;/a&gt;, &lt;a href="https://simonwillison.net/tags/felixgeisendorfer"&gt;felixgeisendorfer&lt;/a&gt;, &lt;a href="https://simonwillison.net/tags/files"&gt;files&lt;/a&gt;&lt;/p&gt;



</summary><category term="binary"/><category term="buffers"/><category term="html5"/><category term="javascript"/><category term="nodejs"/><category term="uploads"/><category term="recovered"/><category term="felixgeisendorfer"/><category term="files"/></entry></feed>