Yes, setting Sin[x y] == 1 gives contradictory equations for the Lagrange multiplier method:
In[8]:= Reduce[lg /. \[Alpha] -> 1]
Out[8]= False
I omitted that in the interest of brevity.
I don't understand your statement that this is not an issue with infinity. It seems to me that the problem is difficult for \[Alpha] = 1 precisely because the Lagrange multiplier goes to infinity. If I run the problem through COIN-OR's interior-point solver IPOPT, I get a very large value for the constraint's Lagrange multiplier.
In[10]:= callIpOpt[
x^2 + y^2, {Sin[x y] == 1}, {{x, -2, 1, 2}, {y, -2, 1, 2}}]
Out[10]= {3.14159, {x -> 1.25331, y -> 1.25331},
"cons values" -> {1.},
"var lb \[Lambda]s" -> {7.70262*10^-10, 7.70262*10^-10},
"var ub \[Lambda]s" -> {3.35603*10^-9, 3.35603*10^-9},
"cons \[Lambda]s" -> {-1.15281*10^7}, "Solve_Succeeded"}
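The divergent multiplier is easy to see by hand: at any feasible point Sin[x y] == 1 forces Cos[x y] == 0, so the constraint gradient Cos[x y] {y, x} vanishes while the objective gradient {2 x, 2 y} does not, and the stationarity condition {2 x, 2 y} == \[Lambda] Cos[x y] {y, x} has no finite \[Lambda] (the constraint qualification fails). A quick numeric check of this, sketched in Python rather than Mathematica:

```python
import math

# Objective f(x, y) = x^2 + y^2, constraint g(x, y) = sin(x*y) - 1 = 0.
# On the symmetric branch x = y with x*y = pi/2:
x = y = math.sqrt(math.pi / 2)

g = math.sin(x * y) - 1                               # constraint residual, ~0
grad_g = (math.cos(x * y) * y, math.cos(x * y) * x)   # constraint gradient, ~(0, 0)
grad_f = (2 * x, 2 * y)                               # objective gradient, nonzero

print(g)        # essentially zero: the point is feasible
print(grad_g)   # essentially (0, 0): the constraint gradient vanishes
print(grad_f)   # nonzero, so grad_f == lambda * grad_g has no finite lambda
```

With the constraint gradient at machine zero and the objective gradient of order one, the finite multiplier IPOPT reports (~10^7) is just how far the barrier iteration got before stopping.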
If I use EvaluationMonitor to look at what FindMinimum is doing internally, I see that it actually does come close to the solution.
In[11]:= resrs =
Reap @ FindMinimum[{x^2 + y^2, Sin[x y] == 1}, {{x, 1}, {y, 1}},
EvaluationMonitor :> Sow[{x, y}]];
During evaluation of In[11]:= FindMinimum::eit: The algorithm does not converge to the tolerance of 4.806217383937354`*^-6 in 500 iterations. The best estimated solution, with feasibility residual, KKT residual, or complementary residual of {0.158529,0.0293408,0}, is returned. >>
In[15]:= resrs[[2, -1, -2]]
Out[15]= {1.25331, 1.25331}
But it apparently discards that point because the convergence criteria have not been met.
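For reference, the point both solvers reach is the closed-form optimum: on the branch x y = \[Pi]/2 with x = y, the minimizer is x = y = Sqrt[\[Pi]/2] \[TildeTilde] 1.25331 and the minimum value is x^2 + y^2 = \[Pi] \[TildeTilde] 3.14159, exactly the numbers in the IPOPT output and in FindMinimum's internal iterate. A quick check, sketched in Python:

```python
import math

# Closed-form solution on the branch x*y = pi/2 with x = y:
# x = y = sqrt(pi/2), and the minimum of x^2 + y^2 is then pi.
x_star = math.sqrt(math.pi / 2)
f_star = 2 * x_star ** 2

print(round(x_star, 5))  # 1.25331, the coordinate IPOPT and FindMinimum reach
print(round(f_star, 5))  # 3.14159, matching IPOPT's objective value
```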