By Freddo411


2008-09-30 21:07:06 8 Comments

I am programmatically exporting data (using PHP 5.2) into a .csv test file.
Example data: Numéro 1 (note the accented e). The data is utf-8 (no prepended BOM).

When I open this file in MS Excel, it displays as NumÃ©ro 1.

I am able to open this in a text editor (UltraEdit) which displays it correctly. UE reports the character is decimal 233.

How can I export text data in a .csv file so that MS Excel will correctly render it, preferably without forcing the use of the import wizard, or non-default wizard settings?

22 comments

@daniels 2008-09-30 21:33:51

Select UTF-8 encoding when importing. If you use Office 2007, this is where you choose it: right after you open the file.

@Freddo411 2008-09-30 21:45:33

This is useful. I have modified the question to ask how to do this without resorting to the wizard.

@James Baker 2008-09-30 21:30:17

A correctly formatted UTF8 file can have a Byte Order Mark as its first three octets. These are the hex values 0xEF, 0xBB, 0xBF. These octets serve to mark the file as UTF8 (since they are not relevant as "byte order" information). If this BOM does not exist, the consumer/reader is left to infer the encoding type of the text. Readers that are not UTF8 capable will read the bytes as some other encoding such as Windows-1252 and display the characters ï»¿ at the start of the file.

There is a known bug where Excel, upon opening UTF8 CSV files via file association, assumes that they are in a single-byte encoding, disregarding the presence of the UTF8 BOM. This cannot be fixed by any system default codepage or language setting. The BOM will not clue in Excel - it just won't work. (A minority report claims that the BOM sometimes triggers the "Import Text" wizard.) This bug appears to exist in Excel 2003 and earlier. Most reports (amidst the answers here) say that this is fixed in Excel 2007 and newer.

Note that you can always* correctly open UTF8 CSV files in Excel using the "Import Text" wizard, which allows you to specify the encoding of the file you're opening. Of course this is much less convenient.

Readers of this answer are most likely in a situation where they don't particularly support Excel < 2007, but are sending raw UTF8 text to Excel, which is misinterpreting it and sprinkling your text with Ã© and other similar Windows-1252 characters. Adding the UTF8 BOM is probably your best and quickest fix.

If you are stuck with users on older Excels, and Excel is the only consumer of your CSVs, you can work around this by exporting UTF16 instead of UTF8. Excel 2000 and 2003 will double-click-open these correctly. (Some other text editors can have issues with UTF16, so you may have to weigh your options carefully.)


* Except when you can't: (at least) Excel 2011 for Mac's Import Wizard does not actually always work with all encodings, regardless of what you tell it. </anecdotal-evidence> :)
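As a quick illustration of the BOM fix described above (a minimal sketch in Python rather than the OP's PHP; the file name is illustrative), the "utf-8-sig" codec prepends the EF BB BF signature automatically:

```python
import csv

rows = [["id", "name"], ["1", "Numéro 1"]]

# "utf-8-sig" writes the EF BB BF signature before the data, which is
# what newer Excels key on; newline="" lets the csv module control EOLs.
with open("export.csv", "w", encoding="utf-8-sig", newline="") as f:
    csv.writer(f).writerows(rows)

with open("export.csv", "rb") as f:
    raw = f.read()
```

The rest of the file is plain UTF-8; only the three signature bytes are added at the front.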

@Freddo411 2008-09-30 21:51:44

Adding a BOM appears to encourage Excel to show the import wizard. Useful, but not sufficiently elegant. I'll try the utf-16 idea.

@Triynko 2010-05-17 20:31:05

Took me forever to find where to specify the encoding. Save Dialog > Tools Button > Web Options > Encoding Tab. They sure are good at hiding such important things.

@Victor Nicollet 2011-01-24 10:28:13

Wrong: adding a BOM to a UTF-8 file loads that file correctly without requiring the import wizard in Excel 2007.

@James Baker 2011-01-24 21:15:45

Victor - which BOM? I see a couple answers here recommending the UTF16 BOM for what are actually UTF8 files?

@Danny Tuppeny 2011-03-02 17:56:33

We found the same thing as Victor says today (using Excel 2010, it's all we had available). Adding a UTF-8 BOM/Signature (EF BB BF) seemed to fix the double-clicking using the system default encoding, and correctly uses UTF8 :)

@bobince 2012-04-14 10:38:29

In general, a UTF-8-encoded file should not have a Byte Order Mark prepended. UTF-8 does not have variable byte order, and putting it there sabotages UTF-8's ASCII compatibility. There are some specific file formats that either allow or encourage a UTF-8 faux-BOM, but otherwise it should be avoided. CSV is entirely encoding-ignorant so it's anyone's guess as to whether a given tool will interpret the byte sequence 0xEF 0xBB 0xBF as an indicator of UTF-8; an invisible control character in the first cell; the characters ï»¿ in the first cell; or something else entirely.

@Ian Boyd 2012-09-06 21:36:17

@bobince The only downside to omitting a Byte Order Mark from UTF8 is that nobody knows it's UTF8.

@bobince 2012-09-06 21:40:17

@Ian: Nobody knows for sure it's UTF-8 with a BOM either - 0xEF 0xBB 0xBF is a valid sequence in most legacy encodings too (hence it often being misinterpreted as ISO-8859-1 or cp1252 and displayed as ï»¿). It only helps guessing algorithms, and for file formats that specifically make allowances for it (eg XML). The downside to including a faux-BOM in UTF-8 files is that you break their ASCII-compatibility (a major selling point for UTF-8). Many encoding-ignorant text tools will break when faced with an unexpected leading faux-BOM.

@JB. 2013-04-24 15:19:20

BTW, it does not function properly with Excel 2013. The .CSV (TAB or ; separator) file is read correctly, but when saving it falls back to ANSI encoding and TAB separator (WTF!).

@Mooing Duck 2013-07-10 22:13:00

@bobince: UTF-8's ASCII compatibility refers to things like an ASCII find(mystring, "X") working when mystring is UTF-8. It does not refer to banning non-ASCII characters. Including the BOM does not change this behavior, and won't break most encoding-ignorant text tools. UTF7 and UTF16 can fail on those encoding-ignorant calls.

@bobince 2013-07-10 22:51:46

UTF-8+BOM is not ASCII compatible because there is no string that is encoded as the same byte sequence as ASCII even for basic 7-bit chars—"X" is 0xEF,0xBB,0xBF,0x58 instead of 0x58. Encoding-ignorant text tools don't like seeing three bytes of random unparseable crud at the start of their input and usually do fail; this has plagued shell commands and scripts in particular. It is true that UTF-7 and UTF-16 also fail in this situation, which is why UTF-16 is rarely used outside of memory storage and UTF-7 is not used at all. UTF-8 is supposed to be the portable alternative; BOM wrecks that.

@Leon Timmermans 2013-09-27 13:03:45

Modified the answer to clarify that the BOM is 3 octets, not three characters (terminology is a total PITA in this field), and that the legacy encoding Excel uses isn't ASCII but something else (on Western machines, Windows-1252). ASCII is a 7-bit encoding.

@Sebastian 2013-11-09 21:15:09

My findings are that Excel will open a UTF-8 BOM csv correctly, but can easily garble the chosen delimiter. And even when it opened and displayed successfully, it still managed to destroy the output on saving. The only way I could prevent that was using UTF-16LE with tab as the delimiter.

@me1111 2013-11-28 10:35:53

UTF-16LE + its BOM worked for me too and solved the encoding problem when opening csv in Excel by double-click. But for me tab as a delimiter (instead of comma) is unwanted behaviour. Is there any way to change this?

@Casey 2015-03-05 17:27:47

Curiously I am using Excel 2013 but did not find that the BOM fix worked for me.

@Casey 2015-03-05 17:41:06

Ah ha... it turns out that having sep=, at the beginning of the file actually breaks this trick.

@Basj 2015-04-22 08:47:34

How to quickly add the BOM to an existing .csv file with a text editor, e.g. Sublime Text?

@mklement0 2016-01-13 22:28:08

@Triynko: That's good to know, but it's important to note that this encoding only applies when saving in Web Page format - not as CSV.

@Jonathan Lidbeck 2016-11-07 04:16:15

@Basj Open .csv in Sublime Text, hit File > Save with Encoding > UTF-8 with BOM.

@AJ_83 2017-03-07 15:48:13

Is there a solution for the "sep=," problem?

@nonzaprej 2018-01-08 11:19:28

Yep, even if you put the BOM, if you use sep= to specify the separator Excel will still not use the right encoding. I just tried using ; as the separator and now the special characters are correctly displayed. I'm using Excel 2013, btw.

@beep_check 2018-03-23 14:27:33

Yes! I'm exporting a pandas dataframe with Portuguese characters to csv. While Linux seems to know what to do with utf-8, in order to open easily in Excel I had to use utf-16 encoding. Thanks!

@osullic 2018-11-01 10:31:45

But, as the OP asks, if you are "programmatically exporting data (using PHP 5.2)", how do you get 0xEF,0xBB,0xBF into the file in the first instance? What would this string look like in my source code, before I pass it to file_put_contents()?

@osullic 2018-11-01 10:42:40

With a few more minutes research, it looks like it should be: $bom = "\xEF\xBB\xBF"; Surprising to me that nobody actually said this anywhere already, since that was the OP's question.

@Christiaan Westerbeek 2013-08-22 08:26:09

The answer for all combinations of Excel versions (2003 + 2007) and file types

Most other answers here concern their Excel version only and will not necessarily help you, because their answer just might not be true for your version of Excel.

For example, adding the BOM character introduces problems with automatic column separator recognition, but not with every Excel version.

There are 3 variables that determine whether it works in most Excel versions:

  • Encoding
  • BOM character presence
  • Cell separator

Somebody stoic at SAP tried every combination and reported the outcome. End result? Use UTF16le with BOM and tab character as separator to have it work in most Excel versions.

You don't believe me? I wouldn't either, but read here and weep: http://wiki.sdn.sap.com/wiki/display/ABAP/CSV+tests+of+encoding+and+column+separator
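A minimal sketch of that winning combination (UTF-16LE, explicit FF FE BOM, tab separator), in Python for brevity; the file name is illustrative:

```python
import csv
import io

rows = [["id", "name"], ["1", "Numéro 1"]]

# Build tab-separated text first, then encode the whole thing as
# UTF-16LE with an explicit FF FE byte-order mark up front.
buf = io.StringIO()
csv.writer(buf, delimiter="\t", lineterminator="\r\n").writerows(rows)

with open("export.csv", "wb") as f:
    f.write(b"\xff\xfe" + buf.getvalue().encode("utf-16-le"))

with open("export.csv", "rb") as f:
    raw = f.read()
```

Writing the BOM manually and using the BOM-less "utf-16-le" codec (rather than "utf-16") guarantees the mark appears exactly once, at the start of the file.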

@Casey 2015-03-05 17:37:19

Why not just add sep=, or whatever you want to use? If you're already adding the BOM I assume you're not averse to adding stuff to the file.

@Casey 2015-03-05 17:49:47

Well, actually, to answer my own question, you wouldn't add the field separator declaration because it causes this trick to stop working. So basically it's garbled encoding or your file not being interpreted properly as a CSV if your users have the wrong region settings.

@zhaozhi 2015-05-29 13:44:20

utf-16le + BOM(0xFF 0xFE) + tab is the best

@Antonio Bardazzi 2011-10-14 08:29:31

With Ruby 1.8.7 I encode every field to UTF-16 and discard the BOM (maybe).

The following code is extracted from active_scaffold_export:

<%
  require 'fastercsv'
  fcsv_options = {
    :row_sep => "\n",
    :col_sep => params[:delimiter],
    :force_quotes => @export_config.force_quotes,
    :headers => @export_columns.collect { |column| format_export_column_header_name(column) }
  }

  data = FasterCSV.generate(fcsv_options) do |csv|
    csv << fcsv_options[:headers] unless params[:skip_header] == 'true'
    @records.each do |record|
      csv << @export_columns.collect { |column|
        # Convert to UTF-16 discarding the BOM, required for Excel (> 2003 ?)
        Iconv.conv('UTF-16', 'UTF-8', get_export_column_value(record, column))[2..-1]
      }
    end
  end
-%><%= data -%>

The important line is:

Iconv.conv('UTF-16', 'UTF-8', get_export_column_value(record, column))[2..-1]

@Marc Carlucci 2009-10-30 08:50:24

Below is the PHP code I use in my project when sending Microsoft Excel to user:

  /**
   * Export an array as a downloadable Excel CSV
   * @param array   $header
   * @param array   $data
   * @param string  $filename
   */
  function toCSV($header, $data, $filename) {
    $sep  = "\t";
    $eol  = "\n";
    $csv  =  count($header) ? '"'. implode('"'.$sep.'"', $header).'"'.$eol : '';
    foreach($data as $line) {
      $csv .= '"'. implode('"'.$sep.'"', $line).'"'.$eol;
    }
    $encoded_csv = mb_convert_encoding($csv, 'UTF-16LE', 'UTF-8');
    header('Content-Description: File Transfer');
    header('Content-Type: application/vnd.ms-excel');
    header('Content-Disposition: attachment; filename="'.$filename.'.csv"');
    header('Content-Transfer-Encoding: binary');
    header('Expires: 0');
    header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
    header('Pragma: public');
    // +2 accounts for the two UTF-16LE BOM bytes echoed below
    header('Content-Length: '. (strlen($encoded_csv) + 2));
    echo chr(255) . chr(254) . $encoded_csv;
    exit;
  }

UPDATED: Filename improvement and bug fix for correct length calculation. Thanks to TRiG and @ivanhoe011.

@Russell G 2012-01-10 23:49:58

I tried several other suggestions on this page, but this worked for me in Excel 2007. The most important changes were to use tabs instead of commas (even though it's a .csv file) and the line above that echoes the two characters followed by the call to mb_convert_encoding(). I also had to recompile PHP with --enable-mbstring to get support for mb_convert_encoding(). Thanks!

@kasimir 2012-03-20 13:07:34

This worked well for me too, thanks. However, in Safari I get an error in my console 'Resource interpreted as document but transferred as...' I guess it's a WebKit quirk, judging by stackoverflow.com/questions/3899426/…, but perhaps it's not and/or someone has found a solution. Furthermore, in your example I would suggest a change: 'Content-Disposition: attachment; filename="'.$filename.'.csv"' because Firefox wants the double quotes, or else it will cut off your filename after a space.

@TRiG 2013-03-12 14:32:12

Why are you outputting CSV (text/csv) but calling it Excel (application/vnd.ms-excel)?

@TRiG 2013-03-12 14:34:56

Note also further comment at stackoverflow.com/a/10969702/209139.

@Jonathan 2013-07-01 00:29:19

This works great! I can confirm it's working on Mac as well (in Office 2011).

@Rich Bradshaw 2015-09-30 15:20:58

Shouldn't this be header('Content-Length: '. mb_strlen($encoded_csv, 'UTF-16LE')); ?

@Christophe GRISON 2013-04-25 16:54:44

Open the CSV file with Notepad++, click Encoding, select "Convert to UTF-8" (not "Convert to UTF-8 without BOM"), save, then open by double-clicking in Excel. Hope that helps. Christophe GRISON

@Joe W 2013-04-25 17:15:06

This doesn't answer the question, as it's supposed to be done programmatically and not require user intervention to manually re-save every file.

@Johal 2011-03-30 16:26:55

Echo the UTF-8 BOM before outputting the CSV data. This fixes all character issues in Windows but doesn't work for Mac.

echo "\xEF\xBB\xBF";

It works for me because I need to generate a file which will be used on Windows PCs only.

@Christiaan Westerbeek 2014-06-13 09:38:34

Not true for every type of column separator nor every Excel version. Read my answer below (below for now).

@Fred Reillier 2011-11-28 16:35:23

I've found a way to solve the problem. This is a nasty hack but it works: open the doc with OpenOffice, then save it into any Excel format; the resulting .xls or .xlsx will display the accented characters.

@Christiaan Westerbeek 2013-08-22 08:29:50

The OP says he's programmatically exporting, so he's not looking for a solution that needs manual intervention.

@Ned Martin 2012-07-20 04:26:02

Note that including the UTF-8 BOM is not necessarily a good idea - Mac versions of Excel ignore it and will actually display the BOM as ASCII: ï»¿, three nasty characters at the start of the first field in your spreadsheet…

@bobjones 2018-04-18 17:07:20

I know this comment is 6 years later, but FWIW: Using JavaScript to download a file like '\uFEFF' + myCsvString works as expected on Mac Excel 15.19.1 (2016).

@gerald dol 2012-02-18 00:03:28

UTF-8 doesn't work for me in Office 2007 without any service pack, with or without BOM (U+FEFF or 0xEF,0xBB,0xBF; neither works). Installing SP3 makes UTF-8 work when the 0xEF,0xBB,0xBF BOM is prepended.

UTF-16 works when encoding in Python using "utf-16-le" with a 0xFF 0xFE BOM prepended, and using tab as separator. I had to manually write out the BOM, and then use "utf-16-le" rather than "utf-16"; otherwise each encode() prepended the BOM to every row written out, which appeared as garbage on the first column of the second line and after.

I can't tell whether UTF-16 would work without any SP installed, since I can't go back now. Sigh.

This is on Windows; dunno about Office for Mac.

For both working cases, the import works when launching a download directly from the browser, and the text import wizard doesn't intervene; it works like you would expect.

@Adam 2013-05-25 01:22:30

Works on Excel 2011 for Mac too.

@zhaozhi 2015-05-29 13:43:24

Thank you for your post. Using utf-16le is OK even when you didn't install Office 2007 SP3, but the BOM should be 0xFF 0xFE.

@Johann 2011-12-13 11:06:04

If you have legacy code in vb.net like I have, the following code worked for me:

    Response.Clear()
    Response.ClearHeaders()
    Response.ContentType = "text/csv"
    Response.Expires = 0
    Response.AddHeader("Content-Disposition", "attachment; filename=export.csv;")
    Using sw As StreamWriter = New StreamWriter(Context.Response.OutputStream, System.Text.Encoding.Unicode)
        sw.Write(csv)
        sw.Close()
    End Using
    Response.End()

@Benjol 2009-06-05 20:06:12

You can save an html file with the extension 'xls' and accents will work (pre 2007 at least).

Example: save this (using Save As utf8 in Notepad) as test.xls:

<html>
<meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
<table>
<tr>
  <th>id</th>
  <th>name</th>
</tr>
<tr>
 <td>4</td>
 <td>Hélène</td>
</tr>
</table>
</html>

@Sebastian Sastre 2011-11-24 12:18:03

Interesting option. It opens the text right, but for some reason the whole page is completely white, without the classic spreadsheet lines delimiting rows and columns (Office for Mac).

@Benjol 2011-11-24 12:25:26

Yup, same thing in Office 2007 on Windows. It's always surprised me that it worked at all, to be honest. (Note, if you add border="1" to the table, you do get lines, but just around the 4 cells :)

@creechy 2011-10-19 15:44:32

Another solution I found was just to encode the result as Windows Code Page 1252 (Windows-1252 or CP1252). This would be done, for example by setting Content-Type appropriately to something like text/csv; charset=Windows-1252 and setting the character encoding of the response stream similarly.
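The re-encoding step described above can be sketched like this (Python for illustration; the sample string is hypothetical). Note errors="replace", since Windows-1252 cannot represent every character:

```python
csv_text = 'id,name\r\n1,"Numéro 1"\r\n'

# Windows-1252 maps é to the single byte 0xE9, which legacy Excel
# reads with the default Western codepage. Characters outside the
# codepage (e.g. CJK) degrade to "?" under errors="replace".
data = csv_text.encode("windows-1252", errors="replace")
```

This avoids the BOM question entirely, at the cost of only supporting the Windows-1252 repertoire.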

@Sebastian Sastre 2011-11-24 13:44:18

Thanks for this one. Works on excel windows and mac. I'm using it.

@Tom McClure 2012-06-07 20:02:17

This would only work if your non-ASCII character range falls entirely within Windows-1252. So, for example, no Korean/Chinese/Japanese, and no Cyrillic, etc. But I guess you'll slide by with this for most Western European languages.

@Lukas Batteau 2011-05-26 13:16:14

Writing a BOM to the output CSV file actually did work for me in Django:

def handlePersoonListExport(request):
    # Retrieve a query_set
    ...

    template = loader.get_template("export.csv")
    context = Context({
        'data': query_set,
    })

    response = HttpResponse()
    response['Content-Disposition'] = 'attachment; filename=export.csv'
    response['Content-Type'] = 'text/csv; charset=utf-8'
    response.write("\xEF\xBB\xBF")
    response.write(template.render(context))

    return response

For more info http://crashcoursing.blogspot.com/2011/05/exporting-csv-with-special-characters.html Thanks guys!

@tsauerwein 2011-09-21 10:00:23

Yes, this worked for me with Excel 2010. In Java use printWriter.print('\ufeff'), see also How to add a UTF-8 BOM in java.

@Manfred Stienstra 2009-12-09 13:22:14

I can only get CSV to parse properly in Excel 2007 as tab-separated little-endian UTF-16 starting with the proper byte order mark.

@user203319 2009-11-05 09:56:03

Excel 2007 properly reads UTF-8 with BOM (EF BB BF) encoded csv.

Excel 2003 (and maybe earlier) reads UTF-16LE with BOM (FF FE), but with TABs instead of commas or semicolons.

@John Machin 2009-06-12 03:01:12

I've also noticed that the question was "answered" some time ago but I don't understand the stories that say you can't open a utf8-encoded csv file successfully in Excel without using the text wizard.

My reproducible experience: Type Old MacDonald had a farm,ÈÌÉÍØ into Notepad, hit Enter, then Save As (using the UTF-8 option).

Using Python to show what's actually in there:

>>> open('oldmac.csv', 'rb').read()
'\xef\xbb\xbfOld MacDonald had a farm,\xc3\x88\xc3\x8c\xc3\x89\xc3\x8d\xc3\x98\r\n'
>>> ^Z

Good. Notepad has put a BOM at the front.

Now go into Windows Explorer, double click on the file name, or right click and use "Open with ...", and up pops Excel (2003) with display as expected.

@Cocowalla 2010-12-18 17:52:13

I just tried this, and in Excel I see: ÈÌÉÃØ

@John Machin 2010-12-19 21:47:55

@Cocowalla: Well, I just tried this (again; I did test it before posting) and it worked with Excel 2007 (which is what I'm using now). Did you do open('oldmac.csv', 'rb').read() to verify your input?

@Cocowalla 2010-12-22 10:41:54

I didn't try with Excel 2007 (I know Excel 2007 reads UTF-8 files with a BOM just fine), I tried with Excel 2003

@John Machin 2010-12-22 23:57:22

@Cocowalla: Well it worked for me with Excel 2003 when I had it. Are you sure you have the latest service pack for Excel 2003? Did you verify your input as I suggested?

@Cocowalla 2010-12-23 08:20:42

I did verify that notepad had stuck a BOM at the start of the file, but I'm on Excel 2003 SP2 (SP3 is available) - so I guess this only works in SP3

@Kristof Neirynck 2009-02-27 17:29:40

As Fergal said, \uFEFF is the way to go.

<%@LANGUAGE="JAVASCRIPT" CODEPAGE="65001"%>
<%
Response.Clear();
Response.ContentType = "text/csv";
Response.Charset = "utf-8";
Response.AddHeader("Content-Disposition", "attachment; filename=excelTest.csv");
Response.Write("\uFEFF");
// csv text here
%>

@Christiaan Westerbeek 2013-08-22 08:31:32

Just watch and see how your tab separator is ignored in Excel 2007 when you use BOM. You have to come up with something more.

@Fergal 2008-11-24 17:11:18

Prepending a BOM (\uFEFF) worked for me (Excel 2007), in that Excel recognised the file as UTF-8. Otherwise, saving it and using the import wizard works, but is less ideal.

@haridsv 2010-04-30 22:57:11

It still opens the text import wizard, so the difference is just that you can simply double-click; still not ideal, but it's the only known solution anyway.

@Victor Nicollet 2011-01-24 10:28:51

For me, no import wizard appears with Excel 2007.

@Danny Tuppeny 2011-03-02 17:57:10

No import wizard for me either - it works as expected if a UTF8 BOM/Signature (EF BB BF) is present.

@Hut8 2013-05-29 16:32:35

You'll need SP3 with Excel 2007 to make this work.

@Alastair McCormack 2015-01-07 13:51:49

Also, \ufeff is a UTF-16 (BE) BOM not a UTF-8 BOM

@Dave Burt 2015-01-13 00:38:00

No, @AlastairMcCormack, it's either, depending on how it's encoded. "\ufeff" encoded as UTF-8 is exactly EF BB BF. (Encoded as UTF-16 it will be just two bytes.)

@Alastair McCormack 2015-01-13 11:19:43

Good point @DaveBurt. I was thinking hex literals not Unicode literals.

@Jeff Yates 2008-09-30 21:30:01

The CSV format is implemented as ASCII, not Unicode, in Excel, thus mangling the diacritics. We experienced the same issue, which is how I tracked down that the official CSV standard was defined as being ASCII-based in Excel.

@spoulson 2009-07-01 18:28:19

Actually, CSV is not bound to a specific encoding. It's Excel that's assuming ASCII. en.wikipedia.org/wiki/Comma-separated_values

@Jeff Yates 2009-07-01 18:42:33

That's what I said. "implemented as ASCII in Excel", "CSV defined as ASCII-based in Excel". Not sure what point you're making as you appear to be agreeing with me.

@RichardOD 2009-10-13 13:35:58

Actually you say "The CSV format is implemented as ASCI", I think that is where the confusion stems from.

@albertein 2008-09-30 21:12:25

Check the encoding in which you are generating the file; to make Excel display the file correctly you must use the system default codepage.

Which language are you using? If it's .NET, you only need to use Encoding.Default while generating the file.

@Freddo411 2008-09-30 21:18:02

The export data is UTF-8. I am writing the export file with PHP 5.

@albertein 2008-09-30 21:22:28

Transcode the data to the Windows-1252 codepage; I'm not sure how to accomplish it with PHP.

@Adam Rosenfield 2008-09-30 21:11:09

This is just a question of character encodings. It looks like you're exporting your data as UTF-8: é in UTF-8 is the two-byte sequence 0xC3 0xA9, which when interpreted in Windows-1252 is Ã©. When you import your data into Excel, make sure to tell it that the character encoding you're using is UTF-8.

@Freddo411 2008-09-30 21:17:23

I've confirmed that the data is UTF-8. What do I put into the file to let Excel know that my data is UTF-8 (a BOM?)

@albertein 2008-09-30 21:19:25

I think that you need to change the file encoding; Excel uses the system default codepage to handle csv files.

@Mike F 2008-09-30 21:20:48

A BOM might do the trick, yep.

@Adam Rosenfield 2008-09-30 21:21:15

I'm not entirely sure, since I don't have Excel installed on the machine I'm currently using, but with OpenOffice, there's a dropdown box for character encoding when you import a CSV file. From there, choose Unicode (UTF-8).

@albertein 2008-09-30 21:23:43

Excel doesn't have the dropdown AFAIK
