C# HttpWebRequest "The underlying connection was closed: A connection that was expected to be kept alive was closed by the server."
C# HttpWebRequest "The underlying connection was closed: A connection that was expected to be kept alive was closed by the server."
I am trying to build a web scraper that needs two requests. The first request is a GET (to create the session) and the other is a POST (to submit a form). When I try to submit the form, I get this error:
The underlying connection was closed: A connection that was expected to be kept alive was closed by the server.
I have already tried forcing TLS 1.2, setting keep-alive to false, and changing the timeout property, but none of that worked.
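Those attempts looked roughly like the sketch below (the helper name and the timeout value are just illustrative, not part of my real code):

// Sketch of the workarounds I tried (none of them fixed the error).
private static void ApplyAttemptedWorkarounds(HttpWebRequest request)
{
    // Force TLS 1.2 for all outgoing connections (process-wide setting).
    ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12;

    // Disable keep-alive so the connection is not reused.
    request.KeepAlive = false;

    // Increase the timeout (the value here is just an example).
    request.Timeout = 30000;
}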
Stack trace:
at System.Net.Sockets.NetworkStream.Read(Byte[] buffer, Int32 offset, Int32 size)
at System.Net.FixedSizeReader.ReadPacket(Byte[] buffer, Int32 offset, Int32 count)
at System.Net.Security._SslStream.StartFrameHeader(Byte[] buffer, Int32 offset, Int32 count, AsyncProtocolRequest asyncRequest)
at System.Net.Security._SslStream.StartReading(Byte[] buffer, Int32 offset, Int32 count, AsyncProtocolRequest asyncRequest)
at System.Net.Security._SslStream.ProcessRead(Byte[] buffer, Int32 offset, Int32 count, AsyncProtocolRequest asyncRequest)
at System.Net.TlsStream.Read(Byte[] buffer, Int32 offset, Int32 size)
at System.Net.PooledStream.Read(Byte[] buffer, Int32 offset, Int32 size)
at System.Net.Connection.SyncRead(HttpWebRequest request, Boolean userRetrievedStream, Boolean probeRead)
My request function:
// Requires: using System; using System.IO; using System.Net; using System.Text;
public static string ObterHtmlPostTest(string url, string post, ref CookieContainer cookieContainer)
{
    try
    {
        Encoding encoding = Encoding.UTF8;
        string postData = post;
        byte[] byteArray = encoding.GetBytes(postData);
        string html = string.Empty;

        var request = (HttpWebRequest)WebRequest.Create(url);
        Stream dataStream;
        StreamReader reader;
        HttpWebResponse response;

        // NonValidatedWebHeader is a custom header collection (not a built-in .NET type).
        NonValidatedWebHeader header = new NonValidatedWebHeader();
        header.Add("Accept-Language", "pt-BR,pt;q=0.9,en-US;q=0.8,en;q=0.7");
        request.Headers = header;

        request.AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate;
        request.CookieContainer = cookieContainer;
        request.Method = "POST";
        request.Accept = "*/*";
        request.UserAgent = "Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/73.0.3683.103 Safari/537.36";
        request.Referer = url;
        request.AllowAutoRedirect = true;
        request.KeepAlive = true;
        request.ContentType = "application/x-www-form-urlencoded";
        request.ContentLength = byteArray.Length;

        // Accept any server certificate (process-wide; for testing only).
        ServicePointManager.ServerCertificateValidationCallback = delegate { return true; };

        // Write the form body to the request stream.
        using (dataStream = request.GetRequestStream())
        {
            dataStream.Write(byteArray, 0, byteArray.Length);
        }

        // Send the request and read the response body.
        response = (HttpWebResponse)request.GetResponse();
        cookieContainer.Add(response.Cookies);
        dataStream = response.GetResponseStream();
        reader = new StreamReader(dataStream, encoding);
        html = reader.ReadToEnd();

        reader.Close();
        dataStream.Close();
        response.Close();
        return html;
    }
    catch (Exception)
    {
        throw;
    }
}
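For context, the two-step flow I described looks roughly like this. ObterHtmlGetTest is a hypothetical name for my GET helper (not shown here), and the URL and form data are placeholders:

// Hypothetical usage sketch: ObterHtmlGetTest, the URL, and the form data are placeholders.
var cookies = new CookieContainer();

// Step 1: GET to create the session; its cookies end up in the shared container.
string loginPage = ObterHtmlGetTest("https://example.com/login", ref cookies);

// Step 2: POST the form with the same cookie container so the session cookies are sent back.
string result = ObterHtmlPostTest("https://example.com/login", "user=foo&senha=bar", ref cookies);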
Solution found:
The site I was trying to scrape uses HTTP 1.0 instead of 1.1. I had to add this to the code:
request.ProtocolVersion = HttpVersion.Version10;
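In my method above, the line goes with the rest of the request setup, before GetRequestStream() is called (a minimal sketch, not the full method):

// Set with the other request properties, before writing the body:
request.KeepAlive = true;
request.ProtocolVersion = HttpVersion.Version10; // the fix: the server only speaks HTTP/1.0
request.ContentType = "application/x-www-form-urlencoded";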