Open Hub (openhub.net), Black Duck Software, Inc.
Apache Spark
Very High Activity
Commits: Listings
Analyzed about 14 hours ago, based on code collected about 15 hours ago.
Nov 25, 2024 – Nov 25, 2025
Showing page 1 of 1,541
Commit Message | Contributor | Date
[SPARK-54491][SQL] Fix insert into temp view on DSv2 table failure | manuzhang | about 16 hours ago
[SPARK-54221][FOLLOW-UP] Add `pypy3.11` in `Build Pipeline Status` | Ruifeng Zheng | about 17 hours ago
[SPARK-54247][PYTHON] Explictly close socket for util._load_from_socket | Tian Gao | about 17 hours ago
[SPARK-54219][CORE] Support `spark.cleaner.referenceTracking.blocking.timeout` config | Angerszhuuuu | about 17 hours ago
[SPARK-54397][PYTHON] Make `UserDefinedType` hashable | Ruifeng Zheng | about 17 hours ago
[SPARK-54488][BUILD] Fix Connect JVM client leaks `protobuf-java-util` in shading rules | Cheng Pan | about 18 hours ago
[SPARK-54456][PYTHON] Import worker module after fork to avoid deadlock | Tian Gao | about 18 hours ago
[SPARK-54473][SQL] Add Avro read and write support for TIME type | vinodkc | about 18 hours ago
[SPARK-54289][SQL][FOLLOW-UP] Make Merge Into update assignment by field default for UPDATE SET * and align configs | Szehon Ho | about 18 hours ago
[SPARK-54115][TESTS][FOLLOW-UP] Refine `org.apache.spark.util.UtilsSuite` | Ruifeng Zheng | 1 day ago
[SPARK-54474][PYTHON] Discard the XML report on tests that are supposed to fail | Tian Gao | 1 day ago
[SPARK-54022][SQL] Make DSv2 table resolution aware of cached tables | Anton Okolnychyi | 2 days ago
[SPARK-54472][SQL] Add ORC read and write support for TIME type | vinodkc | 2 days ago
[SPARK-54425][INFRA] Add two missing test directories to coverage omit | Tian Gao | 2 days ago
[SPARK-54458][INFRA] Use official download-artifact to fix the test_report action | Tian Gao | 2 days ago
[SPARK-54471][INFRA] Remove `build_python_3.11_macos.yml` CI | Dongjoon Hyun | 2 days ago
[SPARK-54470][CORE][TESTS] Fix `BlockManagerDecommissionIntegrationSuite` to wait shuffle migrations | Dongjoon Hyun | 2 days ago
[SPARK-54469][DSTREAM][TESTS] Fix `StreamingContextSuite.stop slow receiver gracefully` test to clean up `SprakContext` | Dongjoon Hyun | 2 days ago
[SPARK-54452] Fix empty response from SparkConnect server for `spark.sql(...)` inside FlowFunction | Yuheng Chang | 2 days ago
[SPARK-54467][INFRA][FOLLOWUP] Disable `actions/cache` on `python_hosted_runner_test.yml` too | Dongjoon Hyun | 2 days ago
[SPARK-54467][INFRA] Disable `actions/cache` on MacOS CIs | Dongjoon Hyun | 2 days ago
[SPARK-50072][SQL] Handle ArithmeticException in interval parsing with large values | AbinayaJayaprakasam | 2 days ago
[SPARK-54465][INFRA] Remove `build_python_connect35.yml` GitHub Action job | Dongjoon Hyun | 2 days ago
[SPARK-54463][SQL] Add CSV serialization and deserialization support for TIME type | vinodkc | 3 days ago
[SPARK-54464][SQL] Remove duplicate `output.reserve` calls in `assembleVariantBatch` | zouxxyy | 3 days ago
[SPARK-54412][SQL][TESTS] Clean up `v` properly in `identifier-clause.sql` SQL golden file | Vlad Rozov | 3 days ago
[SPARK-54364][INFRA][FOLLOWUP] Disable pyspark_install on apache master commits | Tian Gao | 3 days ago
[SPARK-54363][REPL][TESTS] Access to buffer in SparkShellSuite should be synchronized between threads | Vlad Rozov | 3 days ago
[SPARK-53635][SQL] Support Scala UDFs with input args of type Seq[Row] | jameswillis | 3 days ago
[SPARK-54461][SQL] Add XML serialization and deserialization support for TIME type | vinodkc | 3 days ago