
Commit 5ad6d4c

Abel Milash and claude committed
Remove implementation details from dataframe.py tips

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

1 parent 41b02cd, commit 5ad6d4c

1 file changed: src/PowerPlatform/Dataverse/operations/dataframe.py

Lines changed: 8 additions & 10 deletions
@@ -178,11 +178,10 @@ def create(
         IDs does not match the number of input rows.

     .. tip::
-        For DataFrames with more than 1,000 rows, the underlying
-        ``CreateMultiple`` call is split into sequential chunks. This is
-        **not atomic** — if a later chunk fails, earlier rows are already
-        committed. Callers that require atomicity should limit DataFrames
-        to ≤ 1,000 rows per call.
+        For DataFrames with more than 1,000 rows, the operation is split
+        into sequential chunks. This is **not atomic** — if a later chunk
+        fails, earlier rows are already committed. Callers that require
+        atomicity should limit DataFrames to ≤ 1,000 rows per call.

     Example:
         Create records from a DataFrame::
@@ -255,11 +254,10 @@ def update(
         rows are never skipped.

     .. tip::
-        For DataFrames with more than 1,000 rows, the underlying
-        ``UpdateMultiple`` call is split into sequential chunks. This is
-        **not atomic** — if a later chunk fails, earlier rows are already
-        committed. Callers that require atomicity should limit DataFrames
-        to ≤ 1,000 rows per call.
+        For DataFrames with more than 1,000 rows, the operation is split
+        into sequential chunks. This is **not atomic** — if a later chunk
+        fails, earlier rows are already committed. Callers that require
+        atomicity should limit DataFrames to ≤ 1,000 rows per call.

     Example:
         Update records with different values per row::
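The atomicity guidance in both tips can be sketched as follows. This is a minimal, hypothetical helper (`chunk_rows` is not part of the dataframe.py API shown in the diff): it pre-slices rows so each `create`/`update` call stays at or below the 1,000-row limit, letting the caller decide how to handle a failure between chunks. A plain list stands in for a DataFrame; the same slicing pattern applies to a pandas DataFrame via `df.iloc[start:start + chunk_size]`.

```python
def chunk_rows(rows, chunk_size=1000):
    """Yield successive slices of at most chunk_size rows.

    Hypothetical helper for callers that want to control chunk
    boundaries themselves instead of relying on internal splitting.
    """
    for start in range(0, len(rows), chunk_size):
        yield rows[start:start + chunk_size]

# 2,500 rows split into chunks of 1,000, 1,000, and 500.
rows = list(range(2500))
sizes = [len(chunk) for chunk in chunk_rows(rows)]
print(sizes)  # [1000, 1000, 500]
```

Pre-chunking on the caller's side makes the failure boundary explicit: if the second chunk fails, the caller knows exactly which rows were already committed and can retry only the remainder.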
