The last suggestion proposed by Rohit has solved a problem I was having.
My problem is the following. I have a CSV file that contains the typical stock data (Symbol, Sector, Date, Open, High, Low, Close, Volume) and a few indicators (e.g. relative strength, moving average).
There are 49 columns, of which Symbol and Sector are strings.
The 3rd column ("Date") is a date represented as "Month/Day/Year".
When I used SemanticImport as in:
rawData = SemanticImport[filename, HeaderLines -> 1]
things worked well until the file size reached about 200 rows. I first noticed the problem when I tried to run SemanticImport on a file of 257 rows with the 49 columns mentioned above. The cell where I executed the code above simply ran forever (meaning hours) until I had to abort the evaluation.
However, when I did the same thing with a file of 128 rows, SemanticImport worked and returned a Dataset.
Not knowing what was wrong, I thought there might be a bad field in the file, but simply using:
rawData = Import[filename,"Dataset",HeaderLines->1]
data = rawData[All, <|#, "Date" -> DateObject[DateList[{#Date, {"Month","Day", "YearShort"}}]]|> &]
worked without a hitch. So I figured that the file size itself was not the problem, since Import[] was able to load the data and I could then convert all 256 date strings to DateObjects.
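As an aside, I believe the intermediate DateList call can be dropped, since DateObject accepts the same string-plus-format specification directly; a minor simplification that should behave the same:

(* same conversion, letting DateObject parse the string itself *)
data = rawData[All, <|#, "Date" -> DateObject[{#Date, {"Month", "Day", "YearShort"}}]|> &]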
But SemanticImport simply stopped working once the file had about 200 rows.
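For what it's worth, SemanticImport can also be given an explicit list of column types as a second argument, which I assume should cut down the per-field type inference. A sketch for my 49 columns (untested on the 257-row file, and the "Date" interpreter may need care with ambiguous month/day strings):

(* assumed layout: 2 string columns, 1 date column, then 46 numeric columns *)
colTypes = Join[{"String", "String", "Date"}, ConstantArray["Number", 46]];
rawData = SemanticImport[filename, colTypes, HeaderLines -> 1]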
Though your suggestion has allowed me to solve my problem, I would still like to understand why SemanticImport stopped working once my file reached a certain size, whereas Import[] was able to read the same file without any issues.
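In case it helps with the diagnosis, a bisection sketch along these lines could pin down the row count at which the hang starts (probe and lines are names I made up; TimeConstrained is there because the symptom is a hang rather than an error):

lines = ReadList[filename, String]; (* raw lines, including the header *)
probe[n_] := TimeConstrained[
  First@AbsoluteTiming[
    SemanticImportString[StringRiffle[Take[lines, n + 1], "\n"],
     HeaderLines -> 1]],
  60, "timed out"]  (* give up after 60 seconds per attempt *)
probe /@ {64, 128, 192, 256}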